Oct 01 12:53:10 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 01 12:53:10 crc restorecon[4664]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:53:10 crc restorecon[4664]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:53:10 crc restorecon[4664]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:53:10 crc restorecon[4664]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:53:10 crc restorecon[4664]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:10 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 12:53:10 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to
system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 
12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:53:11 crc 
restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 
12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:53:11 crc restorecon[4664]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:53:11 crc restorecon[4664]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:53:11 crc restorecon[4664]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 01 12:53:12 crc kubenswrapper[4851]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 12:53:12 crc kubenswrapper[4851]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 01 12:53:12 crc kubenswrapper[4851]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 12:53:12 crc kubenswrapper[4851]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 01 12:53:12 crc kubenswrapper[4851]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Oct 01 12:53:12 crc kubenswrapper[4851]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.069263 4851 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072798 4851 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072815 4851 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072821 4851 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072827 4851 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072832 4851 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072840 4851 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072846 4851 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072851 4851 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072856 4851 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072868 4851 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072875 4851 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072881 4851 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072886 4851 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072892 4851 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072897 4851 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072902 4851 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072908 4851 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072912 4851 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072917 4851 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072923 4851 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072929 4851 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072934 4851 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072939 4851 feature_gate.go:330] unrecognized feature gate: Example
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072946 4851 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072951 4851 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072955 4851 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072960 4851 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072966 4851 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072971 4851 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072976 4851 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072981 4851 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072986 4851 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072991 4851 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.072996 4851 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073001 4851 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073008 4851 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073015 4851 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073022 4851 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073028 4851 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073034 4851 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073039 4851 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073045 4851 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073050 4851 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073055 4851 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073060 4851 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073066 4851 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073071 4851 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073076 4851 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073081 4851 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073086 4851 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073091 4851 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073096 4851 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073101 4851 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073106 4851 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073112 4851 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073117 4851 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073124 4851 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073131 4851 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073137 4851 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073142 4851 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073147 4851 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073152 4851 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073158 4851 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073163 4851 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073168 4851 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073173 4851 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073178 4851 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073183 4851 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073188 4851 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073194 4851 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.073200 4851 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073824 4851 flags.go:64] FLAG: --address="0.0.0.0" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073840 4851 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073854 4851 flags.go:64] FLAG: --anonymous-auth="true" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073862 4851 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073870 4851 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073876 4851 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073884 4851 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073892 4851 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073898 4851 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073904 4851 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073911 4851 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073918 4851 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073924 4851 flags.go:64] FLAG: --cgroup-driver="cgroupfs" 
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073930 4851 flags.go:64] FLAG: --cgroup-root="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073937 4851 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073943 4851 flags.go:64] FLAG: --client-ca-file="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073948 4851 flags.go:64] FLAG: --cloud-config="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073954 4851 flags.go:64] FLAG: --cloud-provider="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073960 4851 flags.go:64] FLAG: --cluster-dns="[]" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073968 4851 flags.go:64] FLAG: --cluster-domain="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073973 4851 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073980 4851 flags.go:64] FLAG: --config-dir="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073986 4851 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.073993 4851 flags.go:64] FLAG: --container-log-max-files="5" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074001 4851 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074007 4851 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074013 4851 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074019 4851 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074026 4851 flags.go:64] FLAG: --contention-profiling="false" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074032 4851 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074038 4851 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074045 4851 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074051 4851 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074059 4851 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074065 4851 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074072 4851 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074078 4851 flags.go:64] FLAG: --enable-load-reader="false" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074084 4851 flags.go:64] FLAG: --enable-server="true" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074090 4851 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074099 4851 flags.go:64] FLAG: --event-burst="100" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074105 4851 flags.go:64] FLAG: --event-qps="50" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074111 4851 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074116 4851 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 01 12:53:12 crc 
kubenswrapper[4851]: I1001 12:53:12.074122 4851 flags.go:64] FLAG: --eviction-hard="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074130 4851 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074136 4851 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074142 4851 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074148 4851 flags.go:64] FLAG: --eviction-soft="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074154 4851 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074160 4851 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074184 4851 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074192 4851 flags.go:64] FLAG: --experimental-mounter-path="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074198 4851 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074204 4851 flags.go:64] FLAG: --fail-swap-on="true" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074210 4851 flags.go:64] FLAG: --feature-gates="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074221 4851 flags.go:64] FLAG: --file-check-frequency="20s" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074227 4851 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074233 4851 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074239 4851 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074246 4851 flags.go:64] FLAG: --healthz-port="10248" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074252 4851 flags.go:64] FLAG: --help="false" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074258 4851 flags.go:64] FLAG: --hostname-override="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074264 4851 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074270 4851 flags.go:64] FLAG: --http-check-frequency="20s" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074277 4851 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074283 4851 flags.go:64] FLAG: --image-credential-provider-config="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074289 4851 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074295 4851 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074301 4851 flags.go:64] FLAG: --image-service-endpoint="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074306 4851 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074312 4851 flags.go:64] FLAG: --kube-api-burst="100" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074320 4851 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074326 4851 flags.go:64] FLAG: --kube-api-qps="50" Oct 01 12:53:12 crc 
kubenswrapper[4851]: I1001 12:53:12.074332 4851 flags.go:64] FLAG: --kube-reserved="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074338 4851 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074344 4851 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074350 4851 flags.go:64] FLAG: --kubelet-cgroups="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074356 4851 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074362 4851 flags.go:64] FLAG: --lock-file="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074367 4851 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074373 4851 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074379 4851 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074388 4851 flags.go:64] FLAG: --log-json-split-stream="false" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074394 4851 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074400 4851 flags.go:64] FLAG: --log-text-split-stream="false" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074407 4851 flags.go:64] FLAG: --logging-format="text" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074412 4851 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074419 4851 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074424 4851 flags.go:64] FLAG: --manifest-url="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074432 4851 flags.go:64] FLAG: --manifest-url-header="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074440 4851 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074446 4851 flags.go:64] FLAG: --max-open-files="1000000" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074454 4851 flags.go:64] FLAG: --max-pods="110" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074459 4851 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074465 4851 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074472 4851 flags.go:64] FLAG: --memory-manager-policy="None" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074480 4851 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074486 4851 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074492 4851 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074514 4851 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074528 4851 flags.go:64] FLAG: --node-status-max-images="50" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074535 4851 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074541 4851 
flags.go:64] FLAG: --oom-score-adj="-999" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074547 4851 flags.go:64] FLAG: --pod-cidr="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074553 4851 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074562 4851 flags.go:64] FLAG: --pod-manifest-path="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074568 4851 flags.go:64] FLAG: --pod-max-pids="-1" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074575 4851 flags.go:64] FLAG: --pods-per-core="0" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074581 4851 flags.go:64] FLAG: --port="10250" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074588 4851 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074594 4851 flags.go:64] FLAG: --provider-id="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074600 4851 flags.go:64] FLAG: --qos-reserved="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074606 4851 flags.go:64] FLAG: --read-only-port="10255" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074613 4851 flags.go:64] FLAG: --register-node="true" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074618 4851 flags.go:64] FLAG: --register-schedulable="true" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074624 4851 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074635 4851 flags.go:64] FLAG: --registry-burst="10" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074641 4851 flags.go:64] FLAG: --registry-qps="5" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074646 4851 flags.go:64] FLAG: --reserved-cpus="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074653 4851 flags.go:64] FLAG: --reserved-memory="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074660 4851 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074667 4851 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074673 4851 flags.go:64] FLAG: --rotate-certificates="false" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074679 4851 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074684 4851 flags.go:64] FLAG: --runonce="false" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074691 4851 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074697 4851 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074703 4851 flags.go:64] FLAG: --seccomp-default="false" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074709 4851 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074716 4851 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074722 4851 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074728 4851 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 
12:53:12.074735 4851 flags.go:64] FLAG: --storage-driver-password="root" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074741 4851 flags.go:64] FLAG: --storage-driver-secure="false" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074747 4851 flags.go:64] FLAG: --storage-driver-table="stats" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074752 4851 flags.go:64] FLAG: --storage-driver-user="root" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074759 4851 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074765 4851 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074771 4851 flags.go:64] FLAG: --system-cgroups="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074777 4851 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074785 4851 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074792 4851 flags.go:64] FLAG: --tls-cert-file="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074798 4851 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074807 4851 flags.go:64] FLAG: --tls-min-version="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074813 4851 flags.go:64] FLAG: --tls-private-key-file="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074819 4851 flags.go:64] FLAG: --topology-manager-policy="none" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074824 4851 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074831 4851 flags.go:64] FLAG: --topology-manager-scope="container" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074836 4851 flags.go:64] FLAG: --v="2" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074845 4851 flags.go:64] FLAG: --version="false" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074852 4851 flags.go:64] FLAG: --vmodule="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074859 4851 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.074865 4851 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.074999 4851 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075005 4851 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075011 4851 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075017 4851 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075023 4851 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
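Each flags.go:64 record above prints one flag together with its effective value, quoted. When comparing a node's effective flags against its rendered config, it helps to parse that dump back into a map; a small sketch, assuming only the FLAG: --name="value" shape visible in these records:

```python
import re

# Sketch: turn the flags.go:64 dump back into a {flag: value} map.
# The pattern is inferred from the records above (FLAG: --name="value").
FLAG_RE = re.compile(r'FLAG: --([\w.-]+)="([^"]*)"')

def parse_flag_dump(journal_text):
    return dict(FLAG_RE.findall(journal_text))

sample = ('I1001 12:53:12.074013 4851 flags.go:64] FLAG: '
          '--containerd="/run/containerd/containerd.sock" '
          'I1001 12:53:12.074038 4851 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"')
print(parse_flag_dump(sample))
# {'containerd': '/run/containerd/containerd.sock', 'cpu-cfs-quota-period': '100ms'}
```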
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075030 4851 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075036 4851 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075041 4851 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075047 4851 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075053 4851 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075058 4851 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075063 4851 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075068 4851 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075074 4851 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075080 4851 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075085 4851 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075091 4851 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075096 4851 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075103 4851 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075110 4851 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075116 4851 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075121 4851 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075127 4851 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075132 4851 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075137 4851 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075142 4851 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075149 4851 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075154 4851 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075158 4851 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075163 4851 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075168 4851 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075174 4851 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075179 4851 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075184 4851 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075189 4851 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075194 4851 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075199 4851 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075204 4851 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075211 4851 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
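The interleaved W/I records from feature_gate.go show three behaviours: names the vanilla Kubernetes gate parser does not know (the OpenShift-specific ones) are warned about and skipped; deprecated gates such as KMSv1 and GA gates such as ValidatingAdmissionPolicy are applied but warn that they will be removed; and whatever survives is printed afterwards as feature gates: {map[...]}. A toy re-implementation of that merge, under exactly those assumptions taken from the log itself:

```python
# Toy sketch of the gate merge visible above, assuming (as the log suggests)
# that unrecognized names are warned about and dropped, while deprecated and
# GA gates are applied with a removal warning.
KNOWN_GATES = {
    "KMSv1": "deprecated",               # feature_gate.go:351 above
    "ValidatingAdmissionPolicy": "GA",   # feature_gate.go:353 above
    "CloudDualStackNodeIPs": "GA",
    "NodeSwap": "beta",
}

def apply_gates(requested):
    effective = {}
    for name, value in requested.items():
        stage = KNOWN_GATES.get(name)
        if stage is None:
            print(f"W ... unrecognized feature gate: {name}")
            continue
        if stage in ("deprecated", "GA") and value:
            print(f"W ... Setting {stage} feature gate {name}=true. "
                  f"It will be removed in a future release.")
        effective[name] = value
    return effective

# GatewayAPI is OpenShift-specific, so it is dropped exactly as in the log.
print("feature gates:", apply_gates(
    {"KMSv1": True, "GatewayAPI": True, "ValidatingAdmissionPolicy": True}))
```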
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075216 4851 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075222 4851 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075227 4851 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075232 4851 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075237 4851 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075242 4851 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075247 4851 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075252 4851 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075259 4851 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075266 4851 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075272 4851 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075277 4851 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075282 4851 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075287 4851 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075292 4851 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075297 4851 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075302 4851 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075307 4851 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075312 4851 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075317 4851 feature_gate.go:330] unrecognized feature gate: Example Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075322 4851 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075327 4851 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075332 4851 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075339 4851 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075344 4851 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075349 4851 feature_gate.go:330] unrecognized feature gate: 
MachineAPIProviderOpenStack Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075354 4851 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075359 4851 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075364 4851 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075369 4851 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075375 4851 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.075380 4851 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.075931 4851 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.089027 4851 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.089075 4851 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089418 4851 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089440 4851 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089451 4851 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089460 4851 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089470 4851 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089478 4851 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089487 4851 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089517 4851 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089526 4851 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089534 4851 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089542 4851 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089550 4851 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089558 4851 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089570 4851 feature_gate.go:353] Setting GA feature gate 
ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089581 4851 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089590 4851 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089598 4851 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089608 4851 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089617 4851 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089625 4851 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089634 4851 feature_gate.go:330] unrecognized feature gate: Example Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089642 4851 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089652 4851 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089663 4851 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089673 4851 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089681 4851 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089689 4851 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089698 4851 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089707 4851 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089716 4851 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089724 4851 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089733 4851 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089741 4851 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089749 4851 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089757 4851 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089765 4851 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089773 4851 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089780 4851 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089788 4851 feature_gate.go:330] 
unrecognized feature gate: InsightsOnDemandDataGather Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089796 4851 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089804 4851 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089812 4851 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089820 4851 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089828 4851 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089839 4851 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089848 4851 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089856 4851 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089864 4851 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089871 4851 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089879 4851 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089887 4851 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089895 4851 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089903 4851 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089912 4851 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089920 4851 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089931 4851 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089941 4851 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089950 4851 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089960 4851 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089967 4851 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089976 4851 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089984 4851 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.089993 4851 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090001 4851 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090008 4851 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090016 4851 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090024 4851 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090032 4851 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090040 4851 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090048 4851 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090056 4851 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.090069 4851 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090309 4851 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090323 4851 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090332 4851 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090341 4851 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090349 4851 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090358 4851 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090366 4851 feature_gate.go:330] unrecognized feature gate: 
AzureWorkloadIdentity Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090374 4851 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090382 4851 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090390 4851 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090398 4851 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090405 4851 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090413 4851 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090423 4851 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090432 4851 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090440 4851 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090448 4851 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090456 4851 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090464 4851 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090472 4851 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090480 4851 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090488 4851 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090495 4851 feature_gate.go:330] unrecognized feature gate: Example Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090535 4851 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090546 4851 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090555 4851 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
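Because the gate set is re-applied at several startup stages, the same warning block repeats almost verbatim within the same second (compare the 12:53:12.072*, .075*, .089* and .090* runs). When triaging a log like this one, collapsing the repeats to one count per gate name keeps the signal; a sketch:

```python
import re
from collections import Counter

# Triage sketch: count each unrecognized gate once instead of reading the
# same warning block four times over.
GATE_RE = re.compile(r"unrecognized feature gate: (\S+)")

def count_unrecognized(journal_text):
    return Counter(GATE_RE.findall(journal_text))

sample = ("W1001 12:53:12.075216 4851 feature_gate.go:330] unrecognized feature gate: GatewayAPI "
          "W1001 12:53:12.090872 4851 feature_gate.go:330] unrecognized feature gate: GatewayAPI")
print(count_unrecognized(sample))  # Counter({'GatewayAPI': 2})
```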
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090565 4851 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090574 4851 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090582 4851 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090591 4851 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090598 4851 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090606 4851 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090614 4851 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090622 4851 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090630 4851 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090637 4851 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090648 4851 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090657 4851 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090665 4851 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090674 4851 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090682 4851 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090690 4851 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090698 4851 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090706 4851 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090714 4851 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090723 4851 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090731 4851 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090739 4851 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090747 4851 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090755 4851 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090762 4851 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090770 4851 
feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090778 4851 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090787 4851 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090795 4851 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090803 4851 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090811 4851 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090818 4851 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090826 4851 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090833 4851 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090842 4851 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090849 4851 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090857 4851 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090865 4851 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090872 4851 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090880 4851 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090888 4851 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090896 4851 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090906 4851 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
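The certificate_manager records a little further below report the client certificate's expiry (2026-02-24 05:52:08 UTC), a rotation deadline (2025-12-28 18:30:34.91 UTC — upstream kubelet picks a jittered point at roughly 70-90% of the certificate lifetime, an assumption here, not something this log states), and the resulting sleep of 2117h37m22.8s. The wait is simply deadline minus now, which is easy to check:

```python
from datetime import datetime, timezone

# Sketch of the rotation-wait arithmetic in the certificate_manager records
# below: sleep from "now" (the log's wall clock, truncated to the second
# here) until the jittered rotation deadline.
now      = datetime(2025, 10, 1, 12, 53, 12, tzinfo=timezone.utc)
deadline = datetime(2025, 12, 28, 18, 30, 34, 910663, tzinfo=timezone.utc)

wait = deadline - now
hours, rem = divmod(int(wait.total_seconds()), 3600)
print(f"Waiting {hours}h{rem // 60}m{rem % 60}s for next certificate rotation")
# -> Waiting 2117h37m22s, matching the log's 2117h37m22.808331383s to the second
```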
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090916 4851 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.090925 4851 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.090937 4851 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.091948 4851 server.go:940] "Client rotation is on, will bootstrap in background"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.099755 4851 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.099923 4851 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.101949 4851 server.go:997] "Starting client certificate rotation"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.102008 4851 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.102228 4851 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-28 18:30:34.910663022 +0000 UTC
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.102337 4851 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2117h37m22.808331383s for next certificate rotation
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.131959 4851 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.138245 4851 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.166016 4851 log.go:25] "Validated CRI v1 runtime API"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.204150 4851 log.go:25] "Validated CRI v1 image API"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.206145 4851 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.213780 4851 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-01-12-48-13-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.214056 4851 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.238964 4851 manager.go:217] Machine: {Timestamp:2025-10-01 12:53:12.236702141 +0000 UTC m=+0.581819697 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:676661b5-1208-494a-8f95-09660995f514 BootID:e8d9204a-cae6-4011-96df-416039ccb8ba Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:db:30:f7 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:db:30:f7 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:e7:b4:a6 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:99:a4:ee Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:85:1a:df Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ca:4c:83 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:92:bf:58:00:d0:10 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:be:9f:6e:30:9e:73 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.239315 4851 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.239684 4851 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.241446 4851 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.242393 4851 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.242687 4851 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.243034 4851 topology_manager.go:138] "Creating topology manager with none policy"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.243061 4851 container_manager_linux.go:303] "Creating device plugin manager"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.243695 4851 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.243750 4851 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.243988 4851 state_mem.go:36] "Initialized new in-memory state store"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.244488 4851 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.248728 4851 kubelet.go:418] "Attempting to sync node with API server"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.248787 4851 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.248824 4851 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.248845 4851 kubelet.go:324] "Adding apiserver pod source"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.248864 4851 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.253375 4851 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.254701 4851 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.256482 4851 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.256494 4851 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused
Oct 01 12:53:12 crc kubenswrapper[4851]: E1001 12:53:12.256630 4851 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.251:6443: connect: connection refused" logger="UnhandledError"
Oct 01 12:53:12 crc kubenswrapper[4851]: E1001 12:53:12.256645 4851 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.251:6443: connect: connection refused" logger="UnhandledError"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.258221 4851 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.259687 4851 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.259730 4851 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.259747 4851 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.259762 4851 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.259787 4851 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.259802 4851 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.259819 4851 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.259843 4851 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.259860 4851 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.259877 4851 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.259897 4851 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.259912 4851 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.260784 4851 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.261583 4851 server.go:1280] "Started kubelet"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.261705 4851 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.261999 4851 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.262752 4851 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 01 12:53:12 crc systemd[1]: Started Kubernetes Kubelet.
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.263305 4851 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.264596 4851 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.264637 4851 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.264725 4851 volume_manager.go:287] "The desired_state_of_world populator starts"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.264743 4851 volume_manager.go:289] "Starting Kubelet Volume Manager"
Oct 01 12:53:12 crc kubenswrapper[4851]: E1001 12:53:12.264808 4851 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.264853 4851 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.264697 4851 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 07:31:08.715571817 +0000 UTC
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.264877 4851 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 2250h37m56.450700144s for next certificate rotation
Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.265877 4851 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused
Oct 01 12:53:12 crc kubenswrapper[4851]: E1001 12:53:12.265976 4851 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.251:6443: connect: connection refused" logger="UnhandledError"
Oct 01 12:53:12 crc kubenswrapper[4851]: E1001 12:53:12.267803 4851 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.251:6443: connect: connection refused" interval="200ms"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.268107 4851 factory.go:55] Registering systemd factory
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.268154 4851 factory.go:221] Registration of the systemd container factory successfully
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.268663 4851 factory.go:153] Registering CRI-O factory
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.268698 4851 factory.go:221] Registration of the crio container factory successfully
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.268828 4851 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.268882 4851 factory.go:103] Registering Raw factory
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.268915 4851 manager.go:1196] Started watching for new ooms in manager
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.269949 4851 server.go:460] "Adding debug handlers to kubelet server"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.270158 4851 manager.go:319] Starting recovery of all containers
Oct 01 12:53:12 crc kubenswrapper[4851]: E1001 12:53:12.281350 4851 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.251:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a5f1f3fd9abf3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-01 12:53:12.261454835 +0000 UTC m=+0.606572361,LastTimestamp:2025-10-01 12:53:12.261454835 +0000 UTC m=+0.606572361,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.287827 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.289667 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.289709 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.289725 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.289747 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.289805 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.289823 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.289839 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.289859 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.289874 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.289890 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.289906 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.291774 4851 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.291807 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.291830 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.291846 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.291861 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.291876 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.291921 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.291935 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.291947 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.291960 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.291978 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.291992 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292004 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292019 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292031 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292050 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292066 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292080 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292094 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292108 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292123 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292148 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292162 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292174 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292186 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292199 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292214 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292226 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292238 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292253 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292265 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292277 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292288 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292303 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292318 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292356 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292371 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292385 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292399 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292412 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292426 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292445 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292459 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292473 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292488 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292522 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292535 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292552 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292565 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292578 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292590 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292604 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292616 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292628 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292642 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292654 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292665 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292677 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292690 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292703 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292716 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292730 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292749 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292762 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292779 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292793 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292806 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292819 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292833 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292846 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292860 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292874 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292890 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292904 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292916 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292929 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292943 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292957 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292968 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292982 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.292996 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293009 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293024 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293036 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293049 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293062 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293086 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293098 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293111 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293126 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293139 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293152 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293164 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293183 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293199 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293212 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293228 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293243 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293258 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293273 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293290 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293304 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293319 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293332 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293347 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293363 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293375 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293390 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293406 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293419 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293434 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293450 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293464 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293477 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293492 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293524 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293538 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293551 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293568 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293584 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293597 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293611 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293624 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3"
volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293637 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293650 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293662 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293675 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293688 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293701 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293714 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293728 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293741 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293754 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293768 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293783 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293796 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293809 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293825 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293839 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293852 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293866 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293880 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293895 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293909 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293923 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293937 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293952 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293971 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293984 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.293999 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294014 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294027 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294040 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294056 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294106 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294121 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294135 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294198 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294214 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294228 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294242 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294258 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294273 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294290 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294306 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294321 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294338 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294352 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294367 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294381 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294394 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294408 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294423 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294438 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294454 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294471 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294487 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294523 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294539 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294554 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294570 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294585 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294602 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294617 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294633 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294649 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294662 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294677 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294692 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294707 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294722 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294737 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294751 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294766 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294780 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294796 4851 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294807 4851 reconstruct.go:97] "Volume reconstruction finished" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.294816 4851 reconciler.go:26] "Reconciler: start to sync state" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.300013 4851 manager.go:324] Recovery completed Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.310449 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.313091 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.313145 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.313160 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.313982 4851 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 01 12:53:12 crc kubenswrapper[4851]: 
I1001 12:53:12.314000 4851 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.314046 4851 state_mem.go:36] "Initialized new in-memory state store" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.324900 4851 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.326540 4851 policy_none.go:49] "None policy: Start" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.327041 4851 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.327087 4851 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.327116 4851 kubelet.go:2335] "Starting kubelet main sync loop" Oct 01 12:53:12 crc kubenswrapper[4851]: E1001 12:53:12.327165 4851 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.329675 4851 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.329710 4851 state_mem.go:35] "Initializing new in-memory state store" Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.329693 4851 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused Oct 01 12:53:12 crc kubenswrapper[4851]: E1001 12:53:12.329775 4851 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.251:6443: connect: connection refused" logger="UnhandledError" Oct 01 12:53:12 crc kubenswrapper[4851]: E1001 12:53:12.365346 4851 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.390865 4851 manager.go:334] "Starting Device Plugin manager" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.390961 4851 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.391629 4851 server.go:79] "Starting device plugin registration server" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.392269 4851 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.392339 4851 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.392878 4851 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.393007 4851 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.393027 4851 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 01 12:53:12 crc kubenswrapper[4851]: E1001 12:53:12.409041 4851 eviction_manager.go:285] "Eviction manager: failed to get 
summary stats" err="failed to get node info: node \"crc\" not found" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.428223 4851 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.428388 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.429838 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.429908 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.429929 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.430196 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.430481 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.430571 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.432145 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.432201 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.432222 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.432440 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.433312 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.433411 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.433771 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.433833 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.433854 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.434194 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.434316 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.434370 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.435253 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.435297 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.435314 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.435545 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.435581 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.435598 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.435808 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.435835 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.435856 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.436145 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.436580 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.436641 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.437846 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.437874 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.437893 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.437936 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.437964 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.437980 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.438226 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.438266 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.438292 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.438828 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.438880 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.441160 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.441185 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.441198 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:12 crc kubenswrapper[4851]: E1001 12:53:12.468626 4851 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.251:6443: connect: connection refused" interval="400ms" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.492994 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.494082 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.494131 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.494149 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.494182 4851 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 12:53:12 crc kubenswrapper[4851]: E1001 12:53:12.494692 4851 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.251:6443: connect: connection refused" node="crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.495875 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.495929 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.495960 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.495984 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.496007 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.496067 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.496138 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.496182 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.496210 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.496236 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.496258 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.496279 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.496302 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.496382 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.496437 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.597721 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.597776 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.597809 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.597851 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.597898 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.597970 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.597996 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.597972 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.598036 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.598091 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.598132 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.598038 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.598097 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.598056 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.598229 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.598262 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.598290 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:53:12 crc 
kubenswrapper[4851]: I1001 12:53:12.598320 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.598347 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.598369 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.598410 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.598362 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.598352 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.598445 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.598513 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.598539 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.598566 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.598580 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.598596 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.598697 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.695863 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.698013 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.698107 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.698200 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.698277 4851 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 12:53:12 crc kubenswrapper[4851]: E1001 12:53:12.699051 4851 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.251:6443: connect: connection refused" node="crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.769552 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.789650 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.802020 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.828828 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-e874face2b5dd6eff4cfb5886a9c0ed429b130e7f491918b7f799eccb54e5684 WatchSource:0}: Error finding container e874face2b5dd6eff4cfb5886a9c0ed429b130e7f491918b7f799eccb54e5684: Status 404 returned error can't find the container with id e874face2b5dd6eff4cfb5886a9c0ed429b130e7f491918b7f799eccb54e5684 Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.832710 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-3b543b3fab3ab19c71ce105ec2a840a2c2ef0746220726439e35e78fa41e2c49 WatchSource:0}: Error finding container 3b543b3fab3ab19c71ce105ec2a840a2c2ef0746220726439e35e78fa41e2c49: Status 404 returned error can't find the container with id 3b543b3fab3ab19c71ce105ec2a840a2c2ef0746220726439e35e78fa41e2c49 Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.833777 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-34c4c47db8b7eaad80bd413d8e457a5e2a82838c45fc01cfbb9f1e5132294ef2 WatchSource:0}: Error finding container 34c4c47db8b7eaad80bd413d8e457a5e2a82838c45fc01cfbb9f1e5132294ef2: Status 404 returned error can't find the container with id 34c4c47db8b7eaad80bd413d8e457a5e2a82838c45fc01cfbb9f1e5132294ef2 Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.851568 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: I1001 12:53:12.866343 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:53:12 crc kubenswrapper[4851]: E1001 12:53:12.870010 4851 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.251:6443: connect: connection refused" interval="800ms" Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.876928 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-18e5f331292c41803519ece449ed642e6ffc713c7aa023bb532c333849834b1d WatchSource:0}: Error finding container 18e5f331292c41803519ece449ed642e6ffc713c7aa023bb532c333849834b1d: Status 404 returned error can't find the container with id 18e5f331292c41803519ece449ed642e6ffc713c7aa023bb532c333849834b1d Oct 01 12:53:12 crc kubenswrapper[4851]: W1001 12:53:12.883107 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-30ec92bc39de120d0e2fa327b109501e3d0035d2ee7fa61edc5a70b33fc6d022 WatchSource:0}: Error finding container 30ec92bc39de120d0e2fa327b109501e3d0035d2ee7fa61edc5a70b33fc6d022: Status 404 returned error can't find the container with id 30ec92bc39de120d0e2fa327b109501e3d0035d2ee7fa61edc5a70b33fc6d022 Oct 01 12:53:13 crc kubenswrapper[4851]: I1001 12:53:13.099841 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:13 crc kubenswrapper[4851]: I1001 12:53:13.101956 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:13 crc kubenswrapper[4851]: I1001 12:53:13.102004 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:13 crc kubenswrapper[4851]: I1001 12:53:13.102013 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:13 crc kubenswrapper[4851]: I1001 12:53:13.102044 4851 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 12:53:13 crc kubenswrapper[4851]: E1001 12:53:13.102582 4851 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.251:6443: connect: connection refused" node="crc" Oct 01 12:53:13 crc kubenswrapper[4851]: I1001 12:53:13.264314 4851 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused Oct 01 12:53:13 crc kubenswrapper[4851]: I1001 12:53:13.332994 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e874face2b5dd6eff4cfb5886a9c0ed429b130e7f491918b7f799eccb54e5684"} Oct 01 12:53:13 crc kubenswrapper[4851]: I1001 12:53:13.333853 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"30ec92bc39de120d0e2fa327b109501e3d0035d2ee7fa61edc5a70b33fc6d022"} Oct 01 12:53:13 crc kubenswrapper[4851]: 
I1001 12:53:13.334613 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"18e5f331292c41803519ece449ed642e6ffc713c7aa023bb532c333849834b1d"} Oct 01 12:53:13 crc kubenswrapper[4851]: I1001 12:53:13.335293 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"34c4c47db8b7eaad80bd413d8e457a5e2a82838c45fc01cfbb9f1e5132294ef2"} Oct 01 12:53:13 crc kubenswrapper[4851]: I1001 12:53:13.336653 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3b543b3fab3ab19c71ce105ec2a840a2c2ef0746220726439e35e78fa41e2c49"} Oct 01 12:53:13 crc kubenswrapper[4851]: W1001 12:53:13.400135 4851 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused Oct 01 12:53:13 crc kubenswrapper[4851]: E1001 12:53:13.400228 4851 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.251:6443: connect: connection refused" logger="UnhandledError" Oct 01 12:53:13 crc kubenswrapper[4851]: W1001 12:53:13.422737 4851 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused Oct 01 12:53:13 crc kubenswrapper[4851]: E1001 12:53:13.422877 4851 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.251:6443: connect: connection refused" logger="UnhandledError" Oct 01 12:53:13 crc kubenswrapper[4851]: W1001 12:53:13.660616 4851 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused Oct 01 12:53:13 crc kubenswrapper[4851]: E1001 12:53:13.660715 4851 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.251:6443: connect: connection refused" logger="UnhandledError" Oct 01 12:53:13 crc kubenswrapper[4851]: E1001 12:53:13.670838 4851 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.251:6443: connect: connection refused" interval="1.6s" Oct 01 12:53:13 crc kubenswrapper[4851]: W1001 12:53:13.754037 4851 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to 
list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused Oct 01 12:53:13 crc kubenswrapper[4851]: E1001 12:53:13.754112 4851 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.251:6443: connect: connection refused" logger="UnhandledError" Oct 01 12:53:13 crc kubenswrapper[4851]: I1001 12:53:13.903574 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:13 crc kubenswrapper[4851]: I1001 12:53:13.905097 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:13 crc kubenswrapper[4851]: I1001 12:53:13.905137 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:13 crc kubenswrapper[4851]: I1001 12:53:13.905153 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:13 crc kubenswrapper[4851]: I1001 12:53:13.905183 4851 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 12:53:13 crc kubenswrapper[4851]: E1001 12:53:13.905788 4851 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.251:6443: connect: connection refused" node="crc" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.264897 4851 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.341873 4851 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4" exitCode=0 Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.341962 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4"} Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.341990 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.343166 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.343201 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.343213 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.345598 4851 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="294ee2bdf1e056e24e8d283dd2cb54aecfc9b30083609e426f3228aed104b061" exitCode=0 Oct 01 12:53:14 crc kubenswrapper[4851]: 
I1001 12:53:14.345668 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"294ee2bdf1e056e24e8d283dd2cb54aecfc9b30083609e426f3228aed104b061"} Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.345687 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.346663 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.346699 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.346711 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.350312 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e"} Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.350349 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d"} Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.350364 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c"} Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.350377 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8"} Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.350432 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.352822 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.352868 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.352886 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.352940 4851 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9" exitCode=0 Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.353121 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.353156 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9"} Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.354307 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.354327 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.354337 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.355557 4851 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="18608a1fa6b6d967a4d14d4d7bdd4a6a45156e36bd514af9e8680b69256f2add" exitCode=0 Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.355582 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"18608a1fa6b6d967a4d14d4d7bdd4a6a45156e36bd514af9e8680b69256f2add"} Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.355768 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.357193 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.357233 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.357246 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.358945 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.360607 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.360630 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:14 crc kubenswrapper[4851]: I1001 12:53:14.360642 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.264884 4851 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused Oct 01 12:53:15 crc kubenswrapper[4851]: E1001 12:53:15.271326 4851 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.251:6443: connect: connection refused" interval="3.2s" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.360228 4851 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d23bb696019ddd2cabfc396f5da3be99196be13e721b6ae1a761bf90df632765" exitCode=0 Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 
12:53:15.360292 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d23bb696019ddd2cabfc396f5da3be99196be13e721b6ae1a761bf90df632765"} Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.360419 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.361251 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.361271 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.361279 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.364478 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"647cbd9a63725397f8ffbd446cf836c3460abe9ddd35f3b467bb06f911b2c696"} Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.364551 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"efbf0c7573d4d52655a656f8e7b194959cac7d6a16ce8ffe745d9afb60dccb01"} Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.364536 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.364567 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6fafa3446313876d43dd3399cb4bcda6295ffc26df86ef9527278767327c1374"} Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.365370 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.365388 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.365396 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.366945 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"58579541903dde948e229b4bf867706a43dcd3f6a005d845703bbd04842f0c36"} Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.366999 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.367549 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.367565 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.367573 4851 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.370842 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.371191 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.371438 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0ad63bc357af5f649c22eea82a8ed96574a07d3b79209ea41b0af5dbb51bc6e0"} Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.371463 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a"} Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.371473 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3"} Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.371481 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd"} Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.371489 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6"} Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.371951 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.371970 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.371977 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.372369 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.372385 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.372392 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.506636 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.508091 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.508125 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 
12:53:15.508134 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:15 crc kubenswrapper[4851]: I1001 12:53:15.508157 4851 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 12:53:15 crc kubenswrapper[4851]: E1001 12:53:15.508461 4851 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.251:6443: connect: connection refused" node="crc" Oct 01 12:53:15 crc kubenswrapper[4851]: W1001 12:53:15.528067 4851 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused Oct 01 12:53:15 crc kubenswrapper[4851]: E1001 12:53:15.528124 4851 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.251:6443: connect: connection refused" logger="UnhandledError" Oct 01 12:53:16 crc kubenswrapper[4851]: I1001 12:53:16.377472 4851 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f3f2e95d9263d7a363fe56839a56f9499fd18b25ef33d2633cf69f39ed3d94c9" exitCode=0 Oct 01 12:53:16 crc kubenswrapper[4851]: I1001 12:53:16.377658 4851 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 12:53:16 crc kubenswrapper[4851]: I1001 12:53:16.377706 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:16 crc kubenswrapper[4851]: I1001 12:53:16.378481 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:16 crc kubenswrapper[4851]: I1001 12:53:16.379331 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f3f2e95d9263d7a363fe56839a56f9499fd18b25ef33d2633cf69f39ed3d94c9"} Oct 01 12:53:16 crc kubenswrapper[4851]: I1001 12:53:16.379456 4851 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 12:53:16 crc kubenswrapper[4851]: I1001 12:53:16.379489 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:16 crc kubenswrapper[4851]: I1001 12:53:16.380097 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:16 crc kubenswrapper[4851]: I1001 12:53:16.381186 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:16 crc kubenswrapper[4851]: I1001 12:53:16.381225 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:16 crc kubenswrapper[4851]: I1001 12:53:16.381241 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:16 crc kubenswrapper[4851]: I1001 12:53:16.382181 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:16 crc kubenswrapper[4851]: I1001 12:53:16.382393 4851 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:16 crc kubenswrapper[4851]: I1001 12:53:16.382410 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:16 crc kubenswrapper[4851]: I1001 12:53:16.383077 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:16 crc kubenswrapper[4851]: I1001 12:53:16.383107 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:16 crc kubenswrapper[4851]: I1001 12:53:16.383122 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:16 crc kubenswrapper[4851]: I1001 12:53:16.384008 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:16 crc kubenswrapper[4851]: I1001 12:53:16.384055 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:16 crc kubenswrapper[4851]: I1001 12:53:16.384076 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:17 crc kubenswrapper[4851]: I1001 12:53:17.120061 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:53:17 crc kubenswrapper[4851]: I1001 12:53:17.387794 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"db4999b7eb159819f19cd0ed4827425ca03fb1b60e77e680d94de20d62a63d10"} Oct 01 12:53:17 crc kubenswrapper[4851]: I1001 12:53:17.387864 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3cc043765b4eb1500eb1995653ca94504b3ba1140d55de4e109fbddaa359a0ac"} Oct 01 12:53:17 crc kubenswrapper[4851]: I1001 12:53:17.387885 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"87d2f807e6f28ee67accd968e9bc3a4a27253729a8f9881a4e6047b3f35a2cc8"} Oct 01 12:53:17 crc kubenswrapper[4851]: I1001 12:53:17.387886 4851 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 12:53:17 crc kubenswrapper[4851]: I1001 12:53:17.387953 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:17 crc kubenswrapper[4851]: I1001 12:53:17.389290 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:17 crc kubenswrapper[4851]: I1001 12:53:17.389345 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:17 crc kubenswrapper[4851]: I1001 12:53:17.389363 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:18 crc kubenswrapper[4851]: I1001 12:53:18.397793 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ab737dd74404443cbd18ead2cc275f4703cb01a13285f28d1cfd83c0b491dd1d"} Oct 01 12:53:18 crc kubenswrapper[4851]: I1001 12:53:18.397850 4851 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"28b04d167e9bba0d4920689cc7a8981acd7ef4cc88083aca77412ad5bd34a29d"} Oct 01 12:53:18 crc kubenswrapper[4851]: I1001 12:53:18.397965 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:18 crc kubenswrapper[4851]: I1001 12:53:18.399612 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:18 crc kubenswrapper[4851]: I1001 12:53:18.399667 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:18 crc kubenswrapper[4851]: I1001 12:53:18.399684 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:18 crc kubenswrapper[4851]: I1001 12:53:18.624069 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:53:18 crc kubenswrapper[4851]: I1001 12:53:18.624289 4851 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 12:53:18 crc kubenswrapper[4851]: I1001 12:53:18.624354 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:18 crc kubenswrapper[4851]: I1001 12:53:18.625877 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:18 crc kubenswrapper[4851]: I1001 12:53:18.625925 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:18 crc kubenswrapper[4851]: I1001 12:53:18.625940 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:18 crc kubenswrapper[4851]: I1001 12:53:18.708732 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:18 crc kubenswrapper[4851]: I1001 12:53:18.710404 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:18 crc kubenswrapper[4851]: I1001 12:53:18.710465 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:18 crc kubenswrapper[4851]: I1001 12:53:18.710483 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:18 crc kubenswrapper[4851]: I1001 12:53:18.710558 4851 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 12:53:19 crc kubenswrapper[4851]: I1001 12:53:19.003741 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:53:19 crc kubenswrapper[4851]: I1001 12:53:19.004016 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:19 crc kubenswrapper[4851]: I1001 12:53:19.005475 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:19 crc kubenswrapper[4851]: I1001 12:53:19.005522 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:19 crc kubenswrapper[4851]: I1001 12:53:19.005535 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 01 12:53:19 crc kubenswrapper[4851]: I1001 12:53:19.342846 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 01 12:53:19 crc kubenswrapper[4851]: I1001 12:53:19.400533 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:19 crc kubenswrapper[4851]: I1001 12:53:19.401600 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:19 crc kubenswrapper[4851]: I1001 12:53:19.401632 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:19 crc kubenswrapper[4851]: I1001 12:53:19.401644 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:19 crc kubenswrapper[4851]: I1001 12:53:19.803032 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:53:19 crc kubenswrapper[4851]: I1001 12:53:19.803155 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:19 crc kubenswrapper[4851]: I1001 12:53:19.804038 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:19 crc kubenswrapper[4851]: I1001 12:53:19.804100 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:19 crc kubenswrapper[4851]: I1001 12:53:19.804120 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:20 crc kubenswrapper[4851]: I1001 12:53:20.403871 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:20 crc kubenswrapper[4851]: I1001 12:53:20.404780 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:20 crc kubenswrapper[4851]: I1001 12:53:20.404818 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:20 crc kubenswrapper[4851]: I1001 12:53:20.404831 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:22 crc kubenswrapper[4851]: I1001 12:53:22.038213 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:53:22 crc kubenswrapper[4851]: I1001 12:53:22.038433 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:22 crc kubenswrapper[4851]: I1001 12:53:22.039965 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:22 crc kubenswrapper[4851]: I1001 12:53:22.040023 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:22 crc kubenswrapper[4851]: I1001 12:53:22.040041 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:22 crc kubenswrapper[4851]: I1001 12:53:22.047908 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:53:22 crc 
kubenswrapper[4851]: I1001 12:53:22.075323 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:53:22 crc kubenswrapper[4851]: I1001 12:53:22.075631 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:22 crc kubenswrapper[4851]: I1001 12:53:22.077122 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:22 crc kubenswrapper[4851]: I1001 12:53:22.077162 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:22 crc kubenswrapper[4851]: I1001 12:53:22.077175 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:22 crc kubenswrapper[4851]: I1001 12:53:22.409082 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:22 crc kubenswrapper[4851]: I1001 12:53:22.409269 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:53:22 crc kubenswrapper[4851]: E1001 12:53:22.409358 4851 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 01 12:53:22 crc kubenswrapper[4851]: I1001 12:53:22.410036 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:22 crc kubenswrapper[4851]: I1001 12:53:22.410079 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:22 crc kubenswrapper[4851]: I1001 12:53:22.410092 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:23 crc kubenswrapper[4851]: I1001 12:53:23.410413 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:23 crc kubenswrapper[4851]: I1001 12:53:23.411729 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:23 crc kubenswrapper[4851]: I1001 12:53:23.411793 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:23 crc kubenswrapper[4851]: I1001 12:53:23.411811 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:23 crc kubenswrapper[4851]: I1001 12:53:23.419731 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:53:24 crc kubenswrapper[4851]: I1001 12:53:24.323266 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:53:24 crc kubenswrapper[4851]: I1001 12:53:24.413425 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:24 crc kubenswrapper[4851]: I1001 12:53:24.414678 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:24 crc kubenswrapper[4851]: I1001 12:53:24.414727 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:24 crc 
kubenswrapper[4851]: I1001 12:53:24.414743 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:25 crc kubenswrapper[4851]: I1001 12:53:25.416097 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:25 crc kubenswrapper[4851]: I1001 12:53:25.417475 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:25 crc kubenswrapper[4851]: I1001 12:53:25.417564 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:25 crc kubenswrapper[4851]: I1001 12:53:25.417582 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:26 crc kubenswrapper[4851]: W1001 12:53:26.080077 4851 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 01 12:53:26 crc kubenswrapper[4851]: I1001 12:53:26.080215 4851 trace.go:236] Trace[1077677645]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 12:53:16.078) (total time: 10001ms): Oct 01 12:53:26 crc kubenswrapper[4851]: Trace[1077677645]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (12:53:26.080) Oct 01 12:53:26 crc kubenswrapper[4851]: Trace[1077677645]: [10.001272139s] [10.001272139s] END Oct 01 12:53:26 crc kubenswrapper[4851]: E1001 12:53:26.080251 4851 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 01 12:53:26 crc kubenswrapper[4851]: I1001 12:53:26.265017 4851 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 01 12:53:26 crc kubenswrapper[4851]: W1001 12:53:26.407688 4851 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 01 12:53:26 crc kubenswrapper[4851]: I1001 12:53:26.407810 4851 trace.go:236] Trace[1007939394]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 12:53:16.406) (total time: 10001ms): Oct 01 12:53:26 crc kubenswrapper[4851]: Trace[1007939394]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (12:53:26.407) Oct 01 12:53:26 crc kubenswrapper[4851]: Trace[1007939394]: [10.001420513s] [10.001420513s] END Oct 01 12:53:26 crc kubenswrapper[4851]: E1001 12:53:26.407841 4851 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 01 12:53:26 crc kubenswrapper[4851]: I1001 12:53:26.421540 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 01 12:53:26 crc kubenswrapper[4851]: I1001 12:53:26.426275 4851 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0ad63bc357af5f649c22eea82a8ed96574a07d3b79209ea41b0af5dbb51bc6e0" exitCode=255 Oct 01 12:53:26 crc kubenswrapper[4851]: I1001 12:53:26.426314 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0ad63bc357af5f649c22eea82a8ed96574a07d3b79209ea41b0af5dbb51bc6e0"} Oct 01 12:53:26 crc kubenswrapper[4851]: I1001 12:53:26.426450 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:26 crc kubenswrapper[4851]: I1001 12:53:26.427186 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:26 crc kubenswrapper[4851]: I1001 12:53:26.427243 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:26 crc kubenswrapper[4851]: I1001 12:53:26.427257 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:26 crc kubenswrapper[4851]: I1001 12:53:26.428015 4851 scope.go:117] "RemoveContainer" containerID="0ad63bc357af5f649c22eea82a8ed96574a07d3b79209ea41b0af5dbb51bc6e0" Oct 01 12:53:26 crc kubenswrapper[4851]: W1001 12:53:26.430259 4851 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 01 12:53:26 crc kubenswrapper[4851]: I1001 12:53:26.430343 4851 trace.go:236] Trace[1366262233]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 12:53:16.425) (total time: 10004ms): Oct 01 12:53:26 crc kubenswrapper[4851]: Trace[1366262233]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10004ms (12:53:26.430) Oct 01 12:53:26 crc kubenswrapper[4851]: Trace[1366262233]: [10.004957797s] [10.004957797s] END Oct 01 12:53:26 crc kubenswrapper[4851]: E1001 12:53:26.430366 4851 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 01 12:53:27 crc kubenswrapper[4851]: I1001 12:53:27.265016 4851 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 01 
12:53:27 crc kubenswrapper[4851]: I1001 12:53:27.265107 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 01 12:53:27 crc kubenswrapper[4851]: I1001 12:53:27.270149 4851 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 01 12:53:27 crc kubenswrapper[4851]: I1001 12:53:27.270285 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 01 12:53:27 crc kubenswrapper[4851]: I1001 12:53:27.323612 4851 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 01 12:53:27 crc kubenswrapper[4851]: I1001 12:53:27.323686 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 01 12:53:27 crc kubenswrapper[4851]: I1001 12:53:27.430932 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 01 12:53:27 crc kubenswrapper[4851]: I1001 12:53:27.433431 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16"} Oct 01 12:53:27 crc kubenswrapper[4851]: I1001 12:53:27.433649 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:27 crc kubenswrapper[4851]: I1001 12:53:27.434780 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:27 crc kubenswrapper[4851]: I1001 12:53:27.434812 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:27 crc kubenswrapper[4851]: I1001 12:53:27.434823 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:27 crc kubenswrapper[4851]: I1001 12:53:27.956442 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 01 12:53:27 crc kubenswrapper[4851]: I1001 12:53:27.956697 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:27 crc 
kubenswrapper[4851]: I1001 12:53:27.958277 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:27 crc kubenswrapper[4851]: I1001 12:53:27.958302 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:27 crc kubenswrapper[4851]: I1001 12:53:27.958311 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:28 crc kubenswrapper[4851]: I1001 12:53:28.001322 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 01 12:53:28 crc kubenswrapper[4851]: I1001 12:53:28.435660 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:28 crc kubenswrapper[4851]: I1001 12:53:28.438849 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:28 crc kubenswrapper[4851]: I1001 12:53:28.438881 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:28 crc kubenswrapper[4851]: I1001 12:53:28.438894 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:28 crc kubenswrapper[4851]: I1001 12:53:28.449341 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 01 12:53:28 crc kubenswrapper[4851]: I1001 12:53:28.632373 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:53:28 crc kubenswrapper[4851]: I1001 12:53:28.632642 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:28 crc kubenswrapper[4851]: I1001 12:53:28.632963 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:53:28 crc kubenswrapper[4851]: I1001 12:53:28.634101 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:28 crc kubenswrapper[4851]: I1001 12:53:28.634164 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:28 crc kubenswrapper[4851]: I1001 12:53:28.634229 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:28 crc kubenswrapper[4851]: I1001 12:53:28.640329 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:53:29 crc kubenswrapper[4851]: I1001 12:53:29.438332 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:29 crc kubenswrapper[4851]: I1001 12:53:29.438643 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:29 crc kubenswrapper[4851]: I1001 12:53:29.440859 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:29 crc kubenswrapper[4851]: I1001 12:53:29.440904 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:29 crc kubenswrapper[4851]: I1001 12:53:29.440859 4851 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:29 crc kubenswrapper[4851]: I1001 12:53:29.440947 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:29 crc kubenswrapper[4851]: I1001 12:53:29.440969 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:29 crc kubenswrapper[4851]: I1001 12:53:29.440922 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:30 crc kubenswrapper[4851]: I1001 12:53:30.440642 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:30 crc kubenswrapper[4851]: I1001 12:53:30.442164 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:30 crc kubenswrapper[4851]: I1001 12:53:30.442216 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:30 crc kubenswrapper[4851]: I1001 12:53:30.442229 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:30 crc kubenswrapper[4851]: I1001 12:53:30.813920 4851 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 01 12:53:30 crc kubenswrapper[4851]: I1001 12:53:30.848312 4851 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 01 12:53:30 crc kubenswrapper[4851]: I1001 12:53:30.976326 4851 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.262352 4851 apiserver.go:52] "Watching apiserver" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.267884 4851 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.268384 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.268987 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.269448 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:53:31 crc kubenswrapper[4851]: E1001 12:53:31.269610 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.269657 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.269003 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.269907 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.269942 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:53:31 crc kubenswrapper[4851]: E1001 12:53:31.270005 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:53:31 crc kubenswrapper[4851]: E1001 12:53:31.270171 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.274176 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.274645 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.274753 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.275060 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.275172 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.275208 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.275421 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.276402 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.276825 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.322647 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.344166 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.359864 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.366122 4851 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.377356 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.392402 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.413613 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.428055 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:53:31 crc kubenswrapper[4851]: I1001 12:53:31.443337 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.256130 4851 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.257998 4851 trace.go:236] Trace[325435770]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 12:53:20.327) (total time: 11930ms): Oct 01 12:53:32 crc kubenswrapper[4851]: Trace[325435770]: ---"Objects listed" error: 11930ms (12:53:32.257) Oct 01 12:53:32 crc kubenswrapper[4851]: Trace[325435770]: [11.930154106s] [11.930154106s] END Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.258313 4851 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.259879 4851 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.260253 4851 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.341024 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.357714 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.361165 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.361231 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.361268 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.361298 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.361330 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.361360 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.361394 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.361425 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.361459 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.361490 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.361553 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.361585 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.361615 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.361646 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.361745 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.361777 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.361827 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.361861 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.361893 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.361928 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.361960 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.361990 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362061 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362063 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362092 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362063 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362124 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362166 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362197 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362230 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362301 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362332 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362362 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362392 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" 
(UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362423 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362453 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362482 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362537 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362572 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362602 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362633 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362664 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362696 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362685 4851 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362727 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362760 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362767 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362775 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362796 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362805 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362821 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362828 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362868 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362887 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.362931 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363051 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363127 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363168 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363205 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363238 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363273 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363242 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363306 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363340 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363375 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363408 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363441 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363478 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363537 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363569 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363600 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363633 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363665 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363701 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363733 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363776 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363810 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363841 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363871 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363902 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363934 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363965 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363999 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364032 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364062 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364095 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364129 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364161 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364194 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364224 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364260 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364341 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364374 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364406 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364440 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364471 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364527 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364561 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364618 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364668 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364701 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364734 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364784 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364818 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364850 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364883 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364916 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364947 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364979 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365012 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365044 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365087 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365120 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365154 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365187 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365226 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365260 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365295 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365327 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365361 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365396 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365427 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365459 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365492 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365549 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365582 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365616 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365656 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365688 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365723 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365759 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365987 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366019 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366051 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366084 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366120 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366156 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366191 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366225 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366267 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366303 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366335 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366367 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366400 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366436 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366469 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366554 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366597 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366633 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366668 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366707 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366743 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366777 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367038 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367075 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367110 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367143 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367178 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367217 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367248 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367281 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367315 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367349 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367383 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367418 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367558 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367593 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367629 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367664 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367702 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367857 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367893 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367934 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367970 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.368003 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.368035 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.368068 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.368102 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.368139 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.368234 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.368271 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.368307 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.369711 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363285 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.369764 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363318 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363353 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363624 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363666 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363659 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.363786 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364035 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364252 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364269 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364305 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.364519 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370087 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370129 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370167 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370203 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370239 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370275 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370311 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370348 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370386 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370422 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370539 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370582 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370619 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370659 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370694 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370731 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370767 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371037 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371076 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371112 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371145 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371180 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371249 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371292 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371408 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371456 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371494 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371570 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371613 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371648 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371683 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371721 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371758 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371798 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371834 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371873 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371952 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on
node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371974 4851 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371995 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.372019 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.372039 4851 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.372059 4851 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.372080 4851 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.372100 4851 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.372120 4851 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.372140 4851 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.372161 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.372181 4851 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.372202 4851 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.372225 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.372247 4851 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.372267 4851 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.372288 4851 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.376676 4851 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.376711 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.376729 4851 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.376742 4851 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.376756 4851 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.379455 4851 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.388384 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.397591 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.397633 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365117 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: 
"b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365176 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365261 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365650 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.365692 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366407 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366449 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366519 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366580 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.366678 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367029 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367619 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367704 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.367914 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.368174 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.368738 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.368859 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.369213 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.369418 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.369573 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.369733 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370113 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370125 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370361 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370377 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370509 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370587 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370664 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370730 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.370892 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371271 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371300 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371558 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.371799 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.372022 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.372232 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.372263 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.372526 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.373003 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.373364 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.373370 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.373710 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.373897 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.374055 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.374057 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.374142 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.374171 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.374433 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.374550 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.374697 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.374928 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.376975 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.377324 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.377354 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.377442 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.377797 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.378046 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.378323 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.379200 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.379244 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.380969 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.381036 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.381345 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.381381 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.382416 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.383112 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.383654 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.383823 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.384078 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.384317 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.384462 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.385056 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.385078 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.385171 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.385338 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.385361 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.386157 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.386898 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.386906 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.386919 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.387129 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.387153 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.387040 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.387566 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.387681 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.387727 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.387725 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.387784 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.388121 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.388080 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.388157 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). 
InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.388155 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.388292 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.388338 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.388452 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.388589 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.388897 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.388969 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.389288 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.389323 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.389415 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.389543 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.390100 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.390088 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.390111 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.390127 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.391806 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.395884 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.398735 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.399110 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.400895 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.405875 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.405887 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.400682 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.406393 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.406779 4851 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.407248 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.407372 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.407751 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.408383 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.408421 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.409147 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.409181 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.409323 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.409931 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.410297 4851 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.410310 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.410356 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.410358 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.410796 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.410826 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.411161 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.411717 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.413344 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.413558 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.413968 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.414107 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.414144 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.414183 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.414551 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.414743 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.415121 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.415152 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.415242 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.415272 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.416320 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.416320 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.416353 4851 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.415330 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.415560 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.415580 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.415865 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.416696 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.416796 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.417283 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.416956 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.417428 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.417896 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.417551 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.420617 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:32.92056624 +0000 UTC m=+21.265683776 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.420680 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:53:32.920660882 +0000 UTC m=+21.265778598 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.420782 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:32.920767115 +0000 UTC m=+21.265884791 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.420808 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:32.920797666 +0000 UTC m=+21.265915382 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.427247 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.428932 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.428963 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.428980 4851 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.429048 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:32.929027566 +0000 UTC m=+21.274145242 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.429852 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.429985 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.430251 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.430615 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.431487 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.432684 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.433641 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.433903 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.433914 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.434190 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.433957 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.434451 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.435412 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.436871 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.437494 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.439611 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.441786 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.442581 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.446842 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.449997 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.450847 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.452619 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.459240 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.469070 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.474436 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478046 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478226 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478257 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478319 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478348 4851 reconciler_common.go:293] "Volume detached for volume \"audit\" 
(UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478398 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478413 4851 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478428 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478440 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478453 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478466 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478484 4851 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478512 4851 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478528 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478543 4851 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478576 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478589 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 
12:53:32.478602 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478614 4851 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478629 4851 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478641 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478656 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478667 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478679 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478719 4851 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478730 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478742 4851 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478752 4851 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478760 4851 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478769 4851 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478778 4851 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478787 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478799 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478810 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478821 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478833 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478844 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478857 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478869 4851 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478880 4851 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478891 4851 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478903 4851 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478915 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 01 
12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478927 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478938 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478949 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478960 4851 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478971 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478982 4851 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.478994 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479006 4851 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479019 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479030 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479042 4851 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479055 4851 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479066 4851 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node 
\"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479077 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479087 4851 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479098 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479109 4851 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479129 4851 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479140 4851 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479151 4851 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479162 4851 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479173 4851 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479185 4851 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479194 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479205 4851 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479215 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node 
\"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479225 4851 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479236 4851 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479246 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479256 4851 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479266 4851 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479278 4851 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479289 4851 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479300 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479310 4851 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479319 4851 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479330 4851 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479341 4851 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479351 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479360 4851 reconciler_common.go:293] "Volume 
detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479370 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479381 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479391 4851 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479401 4851 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479411 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479423 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479433 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479444 4851 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479457 4851 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479468 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479479 4851 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479491 4851 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479523 4851 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479534 4851 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479545 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479569 4851 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479581 4851 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479597 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479610 4851 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479621 4851 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479632 4851 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479643 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479656 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479666 4851 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479689 4851 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479701 4851 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479715 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479727 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479738 4851 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479750 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479761 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479775 4851 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479788 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479800 4851 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479816 4851 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479828 4851 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479841 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479854 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc 
kubenswrapper[4851]: I1001 12:53:32.479866 4851 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479880 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479892 4851 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479904 4851 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479917 4851 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479930 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479944 4851 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479956 4851 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479968 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479981 4851 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.479993 4851 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480004 4851 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480016 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480029 4851 reconciler_common.go:293] "Volume 
detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480042 4851 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480055 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480069 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480081 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480093 4851 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480104 4851 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480115 4851 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480127 4851 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480138 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480151 4851 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480163 4851 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480176 4851 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480188 4851 reconciler_common.go:293] "Volume detached 
for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480200 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480211 4851 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480224 4851 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480235 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480248 4851 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480259 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480271 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480283 4851 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480295 4851 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480307 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480319 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480331 4851 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480344 4851 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480357 4851 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480370 4851 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480382 4851 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480394 4851 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480405 4851 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480417 4851 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480428 4851 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480441 4851 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480453 4851 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480464 4851 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480477 4851 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480491 4851 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480557 4851 reconciler_common.go:293] "Volume detached for 
volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480570 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.480581 4851 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.490080 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:53:32 crc kubenswrapper[4851]: W1001 12:53:32.504447 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-8c6fc32fb6112c75d0ab6cfe313ce76ea7e9415c5202248941fcfa556295f5b0 WatchSource:0}: Error finding container 8c6fc32fb6112c75d0ab6cfe313ce76ea7e9415c5202248941fcfa556295f5b0: Status 404 returned error can't find the container with id 8c6fc32fb6112c75d0ab6cfe313ce76ea7e9415c5202248941fcfa556295f5b0 Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.507695 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.515999 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:53:32 crc kubenswrapper[4851]: W1001 12:53:32.539007 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-5e63e8f1a0ec640c7a57f39ab5cb4025462d791726d3543a8a73879e2c10b7b4 WatchSource:0}: Error finding container 5e63e8f1a0ec640c7a57f39ab5cb4025462d791726d3543a8a73879e2c10b7b4: Status 404 returned error can't find the container with id 5e63e8f1a0ec640c7a57f39ab5cb4025462d791726d3543a8a73879e2c10b7b4 Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.988424 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.988604 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.988618 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 12:53:33.988592692 +0000 UTC m=+22.333710188 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.988672 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.988711 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.988740 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:53:32 crc kubenswrapper[4851]: I1001 12:53:32.988766 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.988799 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.988834 4851 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.988894 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:33.988879949 +0000 UTC m=+22.333997445 (durationBeforeRetry 1s). 
Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.988911 4851 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.988972 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:33.988951471 +0000 UTC m=+22.334068997 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.989009 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.989082 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.989103 4851 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.989187 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:33.989161787 +0000 UTC m=+22.334279313 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.988970 4851 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Oct 01 12:53:32 crc kubenswrapper[4851]: E1001 12:53:32.989314 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:33.989295911 +0000 UTC m=+22.334413437 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.327703 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.327876 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 12:53:33 crc kubenswrapper[4851]: E1001 12:53:33.327964 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 12:53:33 crc kubenswrapper[4851]: E1001 12:53:33.328088 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.328325 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 12:53:33 crc kubenswrapper[4851]: E1001 12:53:33.328638 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.462990 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5"}
Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.463034 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a"}
Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.463047 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ca8b85be153ee539712c90e0b806eda3874f9fec06b6036ba6e2e6c7c4332a01"}
Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.464556 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5e63e8f1a0ec640c7a57f39ab5cb4025462d791726d3543a8a73879e2c10b7b4"}
Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.466195 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0"}
Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.466222 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8c6fc32fb6112c75d0ab6cfe313ce76ea7e9415c5202248941fcfa556295f5b0"}
Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.468588 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.469542 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.472350 4851 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16" exitCode=255
Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.472449 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16"}
Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.472566 4851 scope.go:117] "RemoveContainer" containerID="0ad63bc357af5f649c22eea82a8ed96574a07d3b79209ea41b0af5dbb51bc6e0"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.493179 4851 scope.go:117] "RemoveContainer" containerID="4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16" Oct 01 12:53:33 crc kubenswrapper[4851]: E1001 12:53:33.493431 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.493493 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.509639 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.534108 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.555882 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.573163 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.586510 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.616033 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.640472 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.657533 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.684866 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.714560 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.734434 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ad63bc357af5f649c22eea82a8ed96574a07d3b79209ea41b0af5dbb51bc6e0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:26Z\\\",\\\"message\\\":\\\"W1001 12:53:15.457305 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 12:53:15.457697 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759323195 cert, and key in /tmp/serving-cert-3436970698/serving-signer.crt, /tmp/serving-cert-3436970698/serving-signer.key\\\\nI1001 12:53:15.750204 1 observer_polling.go:159] Starting file observer\\\\nW1001 12:53:15.754048 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 12:53:15.754191 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:15.756145 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3436970698/tls.crt::/tmp/serving-cert-3436970698/tls.key\\\\\\\"\\\\nF1001 12:53:26.028870 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 
builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.746974 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.997122 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.997182 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.997202 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.997221 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:53:33 crc kubenswrapper[4851]: I1001 12:53:33.997240 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:53:33 crc kubenswrapper[4851]: E1001 12:53:33.997338 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:53:33 crc kubenswrapper[4851]: E1001 12:53:33.997351 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:53:33 crc kubenswrapper[4851]: E1001 12:53:33.997361 4851 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:53:33 crc kubenswrapper[4851]: E1001 12:53:33.997404 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:35.997388875 +0000 UTC m=+24.342506361 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:53:33 crc kubenswrapper[4851]: E1001 12:53:33.997560 4851 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:53:33 crc kubenswrapper[4851]: E1001 12:53:33.997597 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:35.997587391 +0000 UTC m=+24.342704877 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:53:33 crc kubenswrapper[4851]: E1001 12:53:33.997681 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:53:33 crc kubenswrapper[4851]: E1001 12:53:33.997693 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:53:33 crc kubenswrapper[4851]: E1001 12:53:33.997702 4851 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:53:33 crc kubenswrapper[4851]: E1001 12:53:33.997728 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:35.997721324 +0000 UTC m=+24.342838810 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:53:33 crc kubenswrapper[4851]: E1001 12:53:33.997816 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:53:35.997806627 +0000 UTC m=+24.342924113 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:53:33 crc kubenswrapper[4851]: E1001 12:53:33.997837 4851 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:53:33 crc kubenswrapper[4851]: E1001 12:53:33.997966 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:35.99794097 +0000 UTC m=+24.343058456 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.099601 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-5q84l"] Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.100057 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5q84l" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.102733 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.102776 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.103804 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.107650 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.118337 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.137098 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ad63bc357af5f649c22eea82a8ed96574a07d3b79209ea41b0af5dbb51bc6e0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:26Z\\\",\\\"message\\\":\\\"W1001 12:53:15.457305 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 
12:53:15.457697 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759323195 cert, and key in /tmp/serving-cert-3436970698/serving-signer.crt, /tmp/serving-cert-3436970698/serving-signer.key\\\\nI1001 12:53:15.750204 1 observer_polling.go:159] Starting file observer\\\\nW1001 12:53:15.754048 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 12:53:15.754191 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:15.756145 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3436970698/tls.crt::/tmp/serving-cert-3436970698/tls.key\\\\\\\"\\\\nF1001 12:53:26.028870 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.156790 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.170710 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.183779 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.199051 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q78w\" (UniqueName: \"kubernetes.io/projected/b9b653bb-2bc1-4a97-8699-326054117be2-kube-api-access-7q78w\") pod \"node-resolver-5q84l\" (UID: \"b9b653bb-2bc1-4a97-8699-326054117be2\") " pod="openshift-dns/node-resolver-5q84l" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.199129 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b9b653bb-2bc1-4a97-8699-326054117be2-hosts-file\") pod \"node-resolver-5q84l\" (UID: \"b9b653bb-2bc1-4a97-8699-326054117be2\") " pod="openshift-dns/node-resolver-5q84l" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.213433 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.242590 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.261657 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.300594 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b9b653bb-2bc1-4a97-8699-326054117be2-hosts-file\") pod \"node-resolver-5q84l\" (UID: \"b9b653bb-2bc1-4a97-8699-326054117be2\") " pod="openshift-dns/node-resolver-5q84l" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.300661 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q78w\" (UniqueName: \"kubernetes.io/projected/b9b653bb-2bc1-4a97-8699-326054117be2-kube-api-access-7q78w\") pod \"node-resolver-5q84l\" (UID: \"b9b653bb-2bc1-4a97-8699-326054117be2\") " pod="openshift-dns/node-resolver-5q84l" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.300708 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b9b653bb-2bc1-4a97-8699-326054117be2-hosts-file\") pod \"node-resolver-5q84l\" (UID: \"b9b653bb-2bc1-4a97-8699-326054117be2\") " pod="openshift-dns/node-resolver-5q84l" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.326529 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q78w\" (UniqueName: \"kubernetes.io/projected/b9b653bb-2bc1-4a97-8699-326054117be2-kube-api-access-7q78w\") pod \"node-resolver-5q84l\" (UID: \"b9b653bb-2bc1-4a97-8699-326054117be2\") " pod="openshift-dns/node-resolver-5q84l" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.333738 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.334249 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.335132 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.335767 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.336310 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.336794 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.337350 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.337877 4851 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.338490 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.338988 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.339487 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.340357 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.340835 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.341335 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.341860 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.342351 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.342942 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.343354 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.346776 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.347325 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.348477 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.350057 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.350472 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.351464 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.351895 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.353127 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.353814 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.354652 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.355214 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.356031 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.356515 4851 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.356615 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.358635 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.359165 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.359589 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.361061 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.361983 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.362487 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.363446 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.364081 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.364899 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.365455 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.366450 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.367064 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.367875 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.368376 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.369245 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.369950 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.370797 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.371232 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" 
path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.372097 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.372691 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.373223 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.374030 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.374655 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.380509 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.383836 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.400959 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.411424 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5q84l" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.416993 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.418819 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.450921 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.464811 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.477737 4851 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.479944 4851 scope.go:117] "RemoveContainer" containerID="4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16" Oct 01 12:53:34 crc kubenswrapper[4851]: E1001 12:53:34.480088 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.480558 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5q84l" event={"ID":"b9b653bb-2bc1-4a97-8699-326054117be2","Type":"ContainerStarted","Data":"de83a2fc921ba38678da20926cd1a90fdcb62aa058418153d690aa6fccac1488"} Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.482493 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.501282 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.522390 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ad63bc357af5f649c22eea82a8ed96574a07d3b79209ea41b0af5dbb51bc6e0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:26Z\\\",\\\"message\\\":\\\"W1001 12:53:15.457305 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 12:53:15.457697 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759323195 cert, and key in /tmp/serving-cert-3436970698/serving-signer.crt, /tmp/serving-cert-3436970698/serving-signer.key\\\\nI1001 12:53:15.750204 1 observer_polling.go:159] Starting file observer\\\\nW1001 12:53:15.754048 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 12:53:15.754191 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:15.756145 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3436970698/tls.crt::/tmp/serving-cert-3436970698/tls.key\\\\\\\"\\\\nF1001 12:53:26.028870 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 
builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.525181 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-brwh5"] Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.525841 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-fv72m"] Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.526013 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.526106 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.528731 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.528947 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.529000 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.529027 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.529148 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.529181 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.529209 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.529849 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.538353 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.546550 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.550627 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.567809 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.579040 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.589216 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.603385 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.603524 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e98556a-d5e5-44d6-ad39-d13303fc263c-cni-binary-copy\") pod \"multus-additional-cni-plugins-brwh5\" (UID: \"3e98556a-d5e5-44d6-ad39-d13303fc263c\") " pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.603556 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3acff5c-c60b-4f54-acfa-5521ded8b2af-proxy-tls\") pod \"machine-config-daemon-fv72m\" (UID: \"f3acff5c-c60b-4f54-acfa-5521ded8b2af\") " pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.603576 4851 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r4dg\" (UniqueName: \"kubernetes.io/projected/f3acff5c-c60b-4f54-acfa-5521ded8b2af-kube-api-access-2r4dg\") pod \"machine-config-daemon-fv72m\" (UID: \"f3acff5c-c60b-4f54-acfa-5521ded8b2af\") " pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.603608 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e98556a-d5e5-44d6-ad39-d13303fc263c-os-release\") pod \"multus-additional-cni-plugins-brwh5\" (UID: \"3e98556a-d5e5-44d6-ad39-d13303fc263c\") " pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.603628 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmv5t\" (UniqueName: \"kubernetes.io/projected/3e98556a-d5e5-44d6-ad39-d13303fc263c-kube-api-access-nmv5t\") pod \"multus-additional-cni-plugins-brwh5\" (UID: \"3e98556a-d5e5-44d6-ad39-d13303fc263c\") " pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.603664 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e98556a-d5e5-44d6-ad39-d13303fc263c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-brwh5\" (UID: \"3e98556a-d5e5-44d6-ad39-d13303fc263c\") " pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.603685 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3e98556a-d5e5-44d6-ad39-d13303fc263c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-brwh5\" (UID: \"3e98556a-d5e5-44d6-ad39-d13303fc263c\") " pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.603787 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f3acff5c-c60b-4f54-acfa-5521ded8b2af-mcd-auth-proxy-config\") pod \"machine-config-daemon-fv72m\" (UID: \"f3acff5c-c60b-4f54-acfa-5521ded8b2af\") " pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.603860 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e98556a-d5e5-44d6-ad39-d13303fc263c-system-cni-dir\") pod \"multus-additional-cni-plugins-brwh5\" (UID: \"3e98556a-d5e5-44d6-ad39-d13303fc263c\") " pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.603886 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f3acff5c-c60b-4f54-acfa-5521ded8b2af-rootfs\") pod \"machine-config-daemon-fv72m\" (UID: \"f3acff5c-c60b-4f54-acfa-5521ded8b2af\") " pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.603935 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/3e98556a-d5e5-44d6-ad39-d13303fc263c-cnibin\") pod \"multus-additional-cni-plugins-brwh5\" (UID: \"3e98556a-d5e5-44d6-ad39-d13303fc263c\") " pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.620616 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.633543 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.646040 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.660290 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.678880 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.705129 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3e98556a-d5e5-44d6-ad39-d13303fc263c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-brwh5\" (UID: \"3e98556a-d5e5-44d6-ad39-d13303fc263c\") " pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.705183 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e98556a-d5e5-44d6-ad39-d13303fc263c-system-cni-dir\") pod \"multus-additional-cni-plugins-brwh5\" (UID: \"3e98556a-d5e5-44d6-ad39-d13303fc263c\") " pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.705205 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f3acff5c-c60b-4f54-acfa-5521ded8b2af-rootfs\") pod 
\"machine-config-daemon-fv72m\" (UID: \"f3acff5c-c60b-4f54-acfa-5521ded8b2af\") " pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.705224 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f3acff5c-c60b-4f54-acfa-5521ded8b2af-mcd-auth-proxy-config\") pod \"machine-config-daemon-fv72m\" (UID: \"f3acff5c-c60b-4f54-acfa-5521ded8b2af\") " pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.705246 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e98556a-d5e5-44d6-ad39-d13303fc263c-cnibin\") pod \"multus-additional-cni-plugins-brwh5\" (UID: \"3e98556a-d5e5-44d6-ad39-d13303fc263c\") " pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.705280 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e98556a-d5e5-44d6-ad39-d13303fc263c-cni-binary-copy\") pod \"multus-additional-cni-plugins-brwh5\" (UID: \"3e98556a-d5e5-44d6-ad39-d13303fc263c\") " pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.705305 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e98556a-d5e5-44d6-ad39-d13303fc263c-os-release\") pod \"multus-additional-cni-plugins-brwh5\" (UID: \"3e98556a-d5e5-44d6-ad39-d13303fc263c\") " pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.705319 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f3acff5c-c60b-4f54-acfa-5521ded8b2af-rootfs\") pod \"machine-config-daemon-fv72m\" (UID: \"f3acff5c-c60b-4f54-acfa-5521ded8b2af\") " pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.705328 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmv5t\" (UniqueName: \"kubernetes.io/projected/3e98556a-d5e5-44d6-ad39-d13303fc263c-kube-api-access-nmv5t\") pod \"multus-additional-cni-plugins-brwh5\" (UID: \"3e98556a-d5e5-44d6-ad39-d13303fc263c\") " pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.705383 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3acff5c-c60b-4f54-acfa-5521ded8b2af-proxy-tls\") pod \"machine-config-daemon-fv72m\" (UID: \"f3acff5c-c60b-4f54-acfa-5521ded8b2af\") " pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.705409 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r4dg\" (UniqueName: \"kubernetes.io/projected/f3acff5c-c60b-4f54-acfa-5521ded8b2af-kube-api-access-2r4dg\") pod \"machine-config-daemon-fv72m\" (UID: \"f3acff5c-c60b-4f54-acfa-5521ded8b2af\") " pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.705458 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e98556a-d5e5-44d6-ad39-d13303fc263c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-brwh5\" (UID: \"3e98556a-d5e5-44d6-ad39-d13303fc263c\") " pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.705602 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e98556a-d5e5-44d6-ad39-d13303fc263c-os-release\") pod \"multus-additional-cni-plugins-brwh5\" (UID: \"3e98556a-d5e5-44d6-ad39-d13303fc263c\") " pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.705703 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e98556a-d5e5-44d6-ad39-d13303fc263c-cnibin\") pod \"multus-additional-cni-plugins-brwh5\" (UID: \"3e98556a-d5e5-44d6-ad39-d13303fc263c\") " pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.706049 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e98556a-d5e5-44d6-ad39-d13303fc263c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-brwh5\" (UID: \"3e98556a-d5e5-44d6-ad39-d13303fc263c\") " pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.706515 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f3acff5c-c60b-4f54-acfa-5521ded8b2af-mcd-auth-proxy-config\") pod \"machine-config-daemon-fv72m\" (UID: \"f3acff5c-c60b-4f54-acfa-5521ded8b2af\") " pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.706619 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e98556a-d5e5-44d6-ad39-d13303fc263c-system-cni-dir\") pod \"multus-additional-cni-plugins-brwh5\" (UID: \"3e98556a-d5e5-44d6-ad39-d13303fc263c\") " pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.706777 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3e98556a-d5e5-44d6-ad39-d13303fc263c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-brwh5\" (UID: \"3e98556a-d5e5-44d6-ad39-d13303fc263c\") " pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.707275 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e98556a-d5e5-44d6-ad39-d13303fc263c-cni-binary-copy\") pod \"multus-additional-cni-plugins-brwh5\" (UID: \"3e98556a-d5e5-44d6-ad39-d13303fc263c\") " pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.714325 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3acff5c-c60b-4f54-acfa-5521ded8b2af-proxy-tls\") pod \"machine-config-daemon-fv72m\" (UID: \"f3acff5c-c60b-4f54-acfa-5521ded8b2af\") " pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.714627 4851 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\
":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.728429 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r4dg\" (UniqueName: \"kubernetes.io/projected/f3acff5c-c60b-4f54-acfa-5521ded8b2af-kube-api-access-2r4dg\") pod \"machine-config-daemon-fv72m\" (UID: \"f3acff5c-c60b-4f54-acfa-5521ded8b2af\") " pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.732464 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmv5t\" (UniqueName: \"kubernetes.io/projected/3e98556a-d5e5-44d6-ad39-d13303fc263c-kube-api-access-nmv5t\") pod \"multus-additional-cni-plugins-brwh5\" (UID: \"3e98556a-d5e5-44d6-ad39-d13303fc263c\") " pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.741073 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.753256 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.764906 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.779742 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.792636 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.810348 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.824212 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.838639 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-brwh5" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.839627 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.842637 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.853415 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: W1001 12:53:34.855706 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3acff5c_c60b_4f54_acfa_5521ded8b2af.slice/crio-cc8b5bf6738189ab9f9ebb775f82640d8cd5a22d1514e2587cb0b0ba73b25fd5 WatchSource:0}: Error finding container cc8b5bf6738189ab9f9ebb775f82640d8cd5a22d1514e2587cb0b0ba73b25fd5: Status 404 returned error can't find the container with id cc8b5bf6738189ab9f9ebb775f82640d8cd5a22d1514e2587cb0b0ba73b25fd5 Oct 01 12:53:34 crc kubenswrapper[4851]: W1001 12:53:34.857725 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e98556a_d5e5_44d6_ad39_d13303fc263c.slice/crio-df3f48aac45ba908703ea09740ccd5ac05f43d78856dd23d251906659a47bd8c WatchSource:0}: Error finding container df3f48aac45ba908703ea09740ccd5ac05f43d78856dd23d251906659a47bd8c: Status 404 returned error can't find the container with id 
df3f48aac45ba908703ea09740ccd5ac05f43d78856dd23d251906659a47bd8c Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.915486 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-s78wn"] Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.916698 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.919850 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-t5vvf"] Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.920041 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.920287 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-t5vvf" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.920293 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.920673 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.920753 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.920906 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.921118 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.921325 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.921491 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.925328 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.941763 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.959473 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.976565 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:34 crc kubenswrapper[4851]: I1001 12:53:34.994294 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007160 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-multus-cni-dir\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007209 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-host-var-lib-cni-multus\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007237 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-env-overrides\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007265 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-kubelet\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007356 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-log-socket\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007411 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-host-run-netns\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007463 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-ovn-node-metrics-cert\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007490 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-host-var-lib-kubelet\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007541 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-var-lib-openvswitch\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007569 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-multus-socket-dir-parent\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007591 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-host-run-k8s-cni-cncf-io\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007618 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-host-var-lib-cni-bin\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007661 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-host-run-multus-certs\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007685 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-systemd-units\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007730 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f68f162a-4e04-41d2-8197-95bac24aad23-cni-binary-copy\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007752 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-node-log\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007776 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-run-ovn-kubernetes\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007799 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007824 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-ovnkube-config\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007854 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-etc-kubernetes\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007882 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-hostroot\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007903 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-run-systemd\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.007924 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-cni-bin\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.008007 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-run-openvswitch\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.008040 4851 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-os-release\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.008064 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f68f162a-4e04-41d2-8197-95bac24aad23-multus-daemon-config\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.008110 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-run-netns\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.008153 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-cnibin\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.008192 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-run-ovn\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.008230 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-system-cni-dir\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.008248 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-etc-openvswitch\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.008264 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9jss\" (UniqueName: \"kubernetes.io/projected/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-kube-api-access-l9jss\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.008304 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-multus-conf-dir\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.008320 4851 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p4f7\" (UniqueName: \"kubernetes.io/projected/f68f162a-4e04-41d2-8197-95bac24aad23-kube-api-access-7p4f7\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.008336 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-slash\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.008351 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-cni-netd\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.008371 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-ovnkube-script-lib\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.014929 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.031111 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.048103 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec
8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.063461 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.082184 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.096998 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.109593 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-host-var-lib-cni-bin\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.109637 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-host-run-multus-certs\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.109671 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-systemd-units\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.109705 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-multus-socket-dir-parent\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.109725 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-host-run-k8s-cni-cncf-io\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.109742 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-host-var-lib-cni-bin\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.109753 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.109808 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.109828 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f68f162a-4e04-41d2-8197-95bac24aad23-cni-binary-copy\") pod \"multus-t5vvf\" 
(UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.109851 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-node-log\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.109864 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-host-run-multus-certs\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.109872 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-run-ovn-kubernetes\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.109892 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-ovnkube-config\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.109900 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-systemd-units\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.109921 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-etc-kubernetes\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.109942 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-hostroot\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.109958 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-run-systemd\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.109973 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-cni-bin\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110011 4851 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-run-openvswitch\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110034 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-os-release\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110053 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f68f162a-4e04-41d2-8197-95bac24aad23-multus-daemon-config\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110071 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-run-netns\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110094 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-cnibin\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110096 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-multus-socket-dir-parent\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110111 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-run-ovn\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110131 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-system-cni-dir\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110143 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-run-systemd\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110160 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-host-run-k8s-cni-cncf-io\") pod \"multus-t5vvf\" 
(UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110169 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-etc-openvswitch\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110253 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-hostroot\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110270 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-cnibin\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110297 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-cni-bin\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110303 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9jss\" (UniqueName: \"kubernetes.io/projected/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-kube-api-access-l9jss\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110315 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-node-log\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110328 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p4f7\" (UniqueName: \"kubernetes.io/projected/f68f162a-4e04-41d2-8197-95bac24aad23-kube-api-access-7p4f7\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110334 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-etc-openvswitch\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110356 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-etc-kubernetes\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110352 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-slash\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110381 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-slash\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110395 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-cni-netd\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110404 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-run-ovn-kubernetes\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.110921 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-run-netns\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.111011 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f68f162a-4e04-41d2-8197-95bac24aad23-cni-binary-copy\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.111158 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-system-cni-dir\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.111176 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-ovnkube-config\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.111199 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-run-ovn\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.111183 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-run-openvswitch\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.111250 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-cni-netd\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.111302 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-multus-conf-dir\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.111338 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-ovnkube-script-lib\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.111381 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-multus-conf-dir\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.111427 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-os-release\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.111536 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-multus-cni-dir\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.111682 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f68f162a-4e04-41d2-8197-95bac24aad23-multus-daemon-config\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.111734 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-multus-cni-dir\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.111737 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-host-var-lib-cni-multus\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.111784 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-host-var-lib-cni-multus\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.111817 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-env-overrides\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.111925 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-ovnkube-script-lib\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.111996 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-kubelet\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.112028 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-log-socket\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.112052 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-kubelet\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.112073 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-log-socket\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.112116 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-host-run-netns\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.112146 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-ovn-node-metrics-cert\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.112179 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-host-run-netns\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " 
pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.112273 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-env-overrides\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.112370 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-host-var-lib-kubelet\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.112423 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-var-lib-openvswitch\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.112487 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f68f162a-4e04-41d2-8197-95bac24aad23-host-var-lib-kubelet\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.112539 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-var-lib-openvswitch\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.116292 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-ovn-node-metrics-cert\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.126232 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: 
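This first "Failed to update status for pod" entry shows the failure mode that repeats for every pod below: the API server cannot call the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 because the webhook's serving certificate expired 2025-08-24T17:21:41Z, while the node clock reads 2025-10-01. One way to confirm from the node is to pull the certificate off that port and compare its validity window to the current time; a minimal Go sketch (the address is taken from the log, everything else is generic stdlib):

package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Skip verification on purpose: we want to inspect the expired
	// certificate, not fail the handshake the way the API server did.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now()
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore)
	fmt.Printf("notAfter:  %s\n", cert.NotAfter)
	fmt.Printf("expired:   %v (now=%s)\n", now.After(cert.NotAfter), now.Format(time.RFC3339))
}

Against this node the output would show notAfter roughly 38 days in the past, matching the "current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" text in every webhook error below.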
I1001 12:53:35.130184 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p4f7\" (UniqueName: \"kubernetes.io/projected/f68f162a-4e04-41d2-8197-95bac24aad23-kube-api-access-7p4f7\") pod \"multus-t5vvf\" (UID: \"f68f162a-4e04-41d2-8197-95bac24aad23\") " pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.137184 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9jss\" (UniqueName: \"kubernetes.io/projected/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-kube-api-access-l9jss\") pod \"ovnkube-node-s78wn\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.139007 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.160685 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
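The iptables-alerter entry above carries lastState.terminated with exitCode 137 and reason ContainerStatusUnknown ("The container could not be located when the pod was deleted"). Exit code 137 follows the usual shell convention of 128 + signal number, i.e. SIGKILL (9): the container was killed out from under the runtime rather than exiting on its own. A tiny decoder for such codes:

package main

import "fmt"

// Decode the shell convention used for container exit codes: values
// above 128 mean "terminated by signal (code - 128)".
func decode(code int) string {
	if code > 128 {
		return fmt.Sprintf("killed by signal %d", code-128)
	}
	return fmt.Sprintf("exited with status %d", code)
}

func main() {
	fmt.Println(137, "=>", decode(137)) // 137 = 128 + 9  (SIGKILL)
	fmt.Println(143, "=>", decode(143)) // 143 = 128 + 15 (SIGTERM)
	fmt.Println(0, "=>", decode(0))
}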
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.174646 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
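The kube-apiserver-check-endpoints container above crashed with 'pods "kube-apiserver-crc" not found' and now sits in CrashLoopBackOff with "back-off 10s restarting failed container". Kubelet's crash-loop delay is commonly described as starting at 10s and doubling on each subsequent crash up to a 5-minute cap; a sketch of the schedule under that assumption (the parameters are assumptions, not read from this log):

package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed backoff parameters: 10s initial delay, doubling per
	// restart, capped at 5 minutes.
	delay, maxDelay := 10*time.Second, 5*time.Minute
	for restart := 1; restart <= 7; restart++ {
		fmt.Printf("restart %d: back-off %s\n", restart, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}

With restartCount still at 1 here, the container is at the very start of that schedule, which is why the log shows the 10s figure.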
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.193251 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.209151 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.223796 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
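The status patches embedded in these errors are escaped twice (once by klog quoting, once inside the err string), which makes them painful to read in place. Once the backslash escaping is stripped back to plain JSON, re-indenting makes the patch legible; a small reading aid in Go (the fragment below is a shortened piece of one patch from this log, with escaping already removed):

package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"log"
)

func main() {
	// One status-patch fragment from the log, de-escaped to plain JSON.
	patch := `{"metadata":{"uid":"3b6479f0-333b-4a96-9adf-2099afdc2447"},"status":{"podIP":null,"podIPs":null}}`

	var buf bytes.Buffer
	if err := json.Indent(&buf, []byte(patch), "", "  "); err != nil {
		log.Fatal(err)
	}
	fmt.Println(buf.String())
}

Indented this way, each failed patch is just a strategic-merge update of pod conditions and containerStatuses; none of them is malformed, they are all rejected by the same expired-certificate webhook.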
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.235265 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.236343 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.246451 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-t5vvf" Oct 01 12:53:35 crc kubenswrapper[4851]: W1001 12:53:35.248938 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeff1f44a_a0e9_4b4f_9d74_ab1c2bf4b1c9.slice/crio-36ab8cd424e79f64cdab995ff2d93a564be1178f4ef7cf4a4bf7d52cac2b75da WatchSource:0}: Error finding container 36ab8cd424e79f64cdab995ff2d93a564be1178f4ef7cf4a4bf7d52cac2b75da: Status 404 returned error can't find the container with id 36ab8cd424e79f64cdab995ff2d93a564be1178f4ef7cf4a4bf7d52cac2b75da Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.260055 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.279383 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status 
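The multus-additional-cni-plugins pod above is held at PodInitializing behind six init containers (egress-router-binary-copy, cni-plugins, bond-cni-plugin, routeoverride-cni, whereabouts-cni-bincopy, whereabouts-cni). Init containers run strictly in order and each must exit 0 before the next starts, so the main kube-multus-additional-cni-plugins container stays unready until the whole chain finishes. A toy model of that contract (step names come from the log; the injected failure is purely for demonstration):

package main

import (
	"errors"
	"fmt"
)

// Toy model of the init-container contract: strictly sequential, and a
// failure at any step blocks everything after it.
func runInitChain(steps []string, run func(string) error) error {
	for _, s := range steps {
		fmt.Printf("init container %q starting\n", s)
		if err := run(s); err != nil {
			return fmt.Errorf("init container %q failed: %w", s, err)
		}
	}
	return nil
}

func main() {
	chain := []string{
		"egress-router-binary-copy", "cni-plugins", "bond-cni-plugin",
		"routeoverride-cni", "whereabouts-cni-bincopy", "whereabouts-cni",
	}
	err := runInitChain(chain, func(s string) error {
		if s == "bond-cni-plugin" {
			return errors.New("simulated copy failure") // demo only
		}
		return nil
	})
	fmt.Println("result:", err)
}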
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.296121 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.322770 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.327844 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:53:35 crc kubenswrapper[4851]: E1001 12:53:35.327983 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.328081 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.328182 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:53:35 crc kubenswrapper[4851]: E1001 12:53:35.328333 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:53:35 crc kubenswrapper[4851]: E1001 12:53:35.328442 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.375531 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f3
6cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\
\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.404203 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.425216 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.493354 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5q84l" event={"ID":"b9b653bb-2bc1-4a97-8699-326054117be2","Type":"ContainerStarted","Data":"c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124"} Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.495222 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0"} Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.496652 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb"} Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.496672 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6"} Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.496681 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"cc8b5bf6738189ab9f9ebb775f82640d8cd5a22d1514e2587cb0b0ba73b25fd5"} Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.498298 4851 generic.go:334] "Generic (PLEG): 
container finished" podID="3e98556a-d5e5-44d6-ad39-d13303fc263c" containerID="feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf" exitCode=0 Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.498393 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" event={"ID":"3e98556a-d5e5-44d6-ad39-d13303fc263c","Type":"ContainerDied","Data":"feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf"} Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.498424 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" event={"ID":"3e98556a-d5e5-44d6-ad39-d13303fc263c","Type":"ContainerStarted","Data":"df3f48aac45ba908703ea09740ccd5ac05f43d78856dd23d251906659a47bd8c"} Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.499732 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t5vvf" event={"ID":"f68f162a-4e04-41d2-8197-95bac24aad23","Type":"ContainerStarted","Data":"929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965"} Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.499774 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t5vvf" event={"ID":"f68f162a-4e04-41d2-8197-95bac24aad23","Type":"ContainerStarted","Data":"eb9141566c0904a3e448f0ec1e6a47014f7d214ffc425c7845642e47588f3e70"} Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.500930 4851 generic.go:334] "Generic (PLEG): container finished" podID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerID="d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f" exitCode=0 Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.500991 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerDied","Data":"d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f"} Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.501015 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerStarted","Data":"36ab8cd424e79f64cdab995ff2d93a564be1178f4ef7cf4a4bf7d52cac2b75da"} Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.501750 4851 scope.go:117] "RemoveContainer" containerID="4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16" Oct 01 12:53:35 crc kubenswrapper[4851]: E1001 12:53:35.501932 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.506063 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.519888 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.542551 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 
2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.555777 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.573408 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.597419 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.623895 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.635767 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.653464 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.666239 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.681584 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.700091 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.712490 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.731819 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.747600 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.761426 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.797357 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContain
erStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.831874 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.873621 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-oper
ator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.916250 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.951759 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:35 crc kubenswrapper[4851]: I1001 12:53:35.991330 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.021251 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:53:36 crc kubenswrapper[4851]: E1001 12:53:36.021515 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:53:40.021455245 +0000 UTC m=+28.366572871 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.021620 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.021746 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:53:36 crc kubenswrapper[4851]: E1001 12:53:36.021887 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:53:36 crc kubenswrapper[4851]: E1001 12:53:36.021907 4851 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:53:36 crc kubenswrapper[4851]: E1001 12:53:36.021934 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:53:36 crc kubenswrapper[4851]: E1001 12:53:36.021974 4851 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:53:36 crc kubenswrapper[4851]: E1001 12:53:36.021987 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:40.021975979 +0000 UTC m=+28.367093465 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:53:36 crc kubenswrapper[4851]: E1001 12:53:36.022064 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:40.02203588 +0000 UTC m=+28.367153526 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.022400 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:53:36 crc kubenswrapper[4851]: E1001 12:53:36.022555 4851 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:53:36 crc kubenswrapper[4851]: E1001 12:53:36.022607 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:40.022596615 +0000 UTC m=+28.367714291 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.022695 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:53:36 crc kubenswrapper[4851]: E1001 12:53:36.022883 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:53:36 crc kubenswrapper[4851]: E1001 12:53:36.022910 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:53:36 crc kubenswrapper[4851]: E1001 12:53:36.022927 4851 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:53:36 crc kubenswrapper[4851]: E1001 12:53:36.022971 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:40.022961525 +0000 UTC m=+28.368079011 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.030799 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.072514 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.110213 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.147571 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.507603 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerStarted","Data":"0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f"} Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.507680 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerStarted","Data":"6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124"} Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.507696 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerStarted","Data":"e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a"} Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.507709 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerStarted","Data":"96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b"} Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.510832 4851 generic.go:334] "Generic (PLEG): container finished" podID="3e98556a-d5e5-44d6-ad39-d13303fc263c" containerID="b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d" exitCode=0 Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.510911 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" event={"ID":"3e98556a-d5e5-44d6-ad39-d13303fc263c","Type":"ContainerDied","Data":"b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d"} Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.530887 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.550284 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.566553 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.589019 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea17
7225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.608416 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.636461 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.664248 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.686941 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rel
ease\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.717294 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.735692 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.756309 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContain
erStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.773803 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:36 crc kubenswrapper[4851]: I1001 12:53:36.789780 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:37 crc kubenswrapper[4851]: I1001 12:53:37.327674 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:53:37 crc kubenswrapper[4851]: I1001 12:53:37.327714 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:53:37 crc kubenswrapper[4851]: E1001 12:53:37.328457 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:53:37 crc kubenswrapper[4851]: E1001 12:53:37.328596 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:53:37 crc kubenswrapper[4851]: I1001 12:53:37.327736 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:53:37 crc kubenswrapper[4851]: E1001 12:53:37.328694 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:53:37 crc kubenswrapper[4851]: I1001 12:53:37.520280 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerStarted","Data":"f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9"} Oct 01 12:53:37 crc kubenswrapper[4851]: I1001 12:53:37.520353 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerStarted","Data":"d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5"} Oct 01 12:53:37 crc kubenswrapper[4851]: I1001 12:53:37.522880 4851 generic.go:334] "Generic (PLEG): container finished" podID="3e98556a-d5e5-44d6-ad39-d13303fc263c" containerID="c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1" exitCode=0 Oct 01 12:53:37 crc kubenswrapper[4851]: I1001 12:53:37.522948 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" event={"ID":"3e98556a-d5e5-44d6-ad39-d13303fc263c","Type":"ContainerDied","Data":"c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1"} Oct 01 12:53:37 crc kubenswrapper[4851]: I1001 12:53:37.548668 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:37 crc kubenswrapper[4851]: I1001 12:53:37.562909 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:37 crc kubenswrapper[4851]: I1001 12:53:37.584062 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:37 crc kubenswrapper[4851]: I1001 12:53:37.596091 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:37 crc kubenswrapper[4851]: I1001 12:53:37.615668 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:37 crc kubenswrapper[4851]: I1001 12:53:37.630883 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:37 crc kubenswrapper[4851]: I1001 12:53:37.648695 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:37 crc kubenswrapper[4851]: I1001 12:53:37.665312 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k
8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:37 crc kubenswrapper[4851]: I1001 12:53:37.679719 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:37 crc kubenswrapper[4851]: I1001 12:53:37.692166 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-10-01T12:53:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:37 crc kubenswrapper[4851]: I1001 12:53:37.705979 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:37 crc kubenswrapper[4851]: I1001 12:53:37.720523 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:37 crc kubenswrapper[4851]: I1001 12:53:37.735633 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.528071 4851 generic.go:334] "Generic (PLEG): container finished" podID="3e98556a-d5e5-44d6-ad39-d13303fc263c" containerID="3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce" exitCode=0 Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.528112 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" event={"ID":"3e98556a-d5e5-44d6-ad39-d13303fc263c","Type":"ContainerDied","Data":"3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce"} Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.545090 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.559388 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.570452 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.583804 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.597791 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.618562 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageI
D\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.652732 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.660646 4851 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.665855 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.665908 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.665926 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.666036 4851 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.696015 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.699830 4851 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.700119 4851 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.708710 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.708739 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.708750 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.708766 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.708778 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:38Z","lastTransitionTime":"2025-10-01T12:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.718255 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:38 crc kubenswrapper[4851]: E1001 12:53:38.726420 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.728958 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.730440 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.730569 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.730638 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.730711 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.730770 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:38Z","lastTransitionTime":"2025-10-01T12:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:38 crc kubenswrapper[4851]: E1001 12:53:38.741669 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.744775 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.748133 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.748234 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.748323 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.748406 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.748475 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:38Z","lastTransitionTime":"2025-10-01T12:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.760059 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:38 crc kubenswrapper[4851]: E1001 12:53:38.761880 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.764801 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.764898 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.764975 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.765040 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.765101 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:38Z","lastTransitionTime":"2025-10-01T12:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:38 crc kubenswrapper[4851]: E1001 12:53:38.775120 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.775252 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.779815 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.779859 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.779870 4851 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.779887 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.779897 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:38Z","lastTransitionTime":"2025-10-01T12:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:38 crc kubenswrapper[4851]: E1001 12:53:38.791967 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:38 crc kubenswrapper[4851]: E1001 12:53:38.792076 4851 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.793698 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.793726 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.793734 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.793762 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.793772 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:38Z","lastTransitionTime":"2025-10-01T12:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.896212 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.896255 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.896267 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.896284 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.896297 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:38Z","lastTransitionTime":"2025-10-01T12:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.998953 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.999014 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.999032 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.999059 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:38 crc kubenswrapper[4851]: I1001 12:53:38.999079 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:38Z","lastTransitionTime":"2025-10-01T12:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.095304 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-rc8gl"] Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.095899 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rc8gl" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.100075 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.100192 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.101007 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.101622 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.101658 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.101671 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.101714 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.101729 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:39Z","lastTransitionTime":"2025-10-01T12:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.102180 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.116031 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.130198 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.145489 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.163850 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.176098 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.187110 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.200335 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.204566 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.204612 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.204626 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.204647 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.204663 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:39Z","lastTransitionTime":"2025-10-01T12:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.213216 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.238833 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.255690 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k
8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.262368 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j52ts\" (UniqueName: \"kubernetes.io/projected/2437d3f1-5aab-476a-9c9d-16781db5aa71-kube-api-access-j52ts\") pod \"node-ca-rc8gl\" (UID: \"2437d3f1-5aab-476a-9c9d-16781db5aa71\") " pod="openshift-image-registry/node-ca-rc8gl" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.262432 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2437d3f1-5aab-476a-9c9d-16781db5aa71-host\") pod \"node-ca-rc8gl\" (UID: \"2437d3f1-5aab-476a-9c9d-16781db5aa71\") " pod="openshift-image-registry/node-ca-rc8gl" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.262469 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2437d3f1-5aab-476a-9c9d-16781db5aa71-serviceca\") pod \"node-ca-rc8gl\" (UID: \"2437d3f1-5aab-476a-9c9d-16781db5aa71\") " pod="openshift-image-registry/node-ca-rc8gl" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.267634 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.282927 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.296388 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.307364 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.307412 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.307420 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.307442 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.307457 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:39Z","lastTransitionTime":"2025-10-01T12:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.311650 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.328008 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.328075 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:53:39 crc kubenswrapper[4851]: E1001 12:53:39.328172 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:53:39 crc kubenswrapper[4851]: E1001 12:53:39.328334 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.328199 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:53:39 crc kubenswrapper[4851]: E1001 12:53:39.328485 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.363151 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j52ts\" (UniqueName: \"kubernetes.io/projected/2437d3f1-5aab-476a-9c9d-16781db5aa71-kube-api-access-j52ts\") pod \"node-ca-rc8gl\" (UID: \"2437d3f1-5aab-476a-9c9d-16781db5aa71\") " pod="openshift-image-registry/node-ca-rc8gl" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.363197 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2437d3f1-5aab-476a-9c9d-16781db5aa71-host\") pod \"node-ca-rc8gl\" (UID: \"2437d3f1-5aab-476a-9c9d-16781db5aa71\") " pod="openshift-image-registry/node-ca-rc8gl" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.363228 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2437d3f1-5aab-476a-9c9d-16781db5aa71-serviceca\") pod \"node-ca-rc8gl\" (UID: \"2437d3f1-5aab-476a-9c9d-16781db5aa71\") " pod="openshift-image-registry/node-ca-rc8gl" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.363343 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2437d3f1-5aab-476a-9c9d-16781db5aa71-host\") pod \"node-ca-rc8gl\" (UID: \"2437d3f1-5aab-476a-9c9d-16781db5aa71\") " pod="openshift-image-registry/node-ca-rc8gl" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.364201 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2437d3f1-5aab-476a-9c9d-16781db5aa71-serviceca\") pod \"node-ca-rc8gl\" (UID: \"2437d3f1-5aab-476a-9c9d-16781db5aa71\") " pod="openshift-image-registry/node-ca-rc8gl" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.395164 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j52ts\" (UniqueName: \"kubernetes.io/projected/2437d3f1-5aab-476a-9c9d-16781db5aa71-kube-api-access-j52ts\") pod \"node-ca-rc8gl\" (UID: \"2437d3f1-5aab-476a-9c9d-16781db5aa71\") " pod="openshift-image-registry/node-ca-rc8gl" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.409855 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.409920 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.409941 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.409968 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.409987 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:39Z","lastTransitionTime":"2025-10-01T12:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.410970 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rc8gl" Oct 01 12:53:39 crc kubenswrapper[4851]: W1001 12:53:39.424798 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2437d3f1_5aab_476a_9c9d_16781db5aa71.slice/crio-b479de566b73b5f3a4272af0ddc581809b0548013bae8a3781d89ab90b3854e0 WatchSource:0}: Error finding container b479de566b73b5f3a4272af0ddc581809b0548013bae8a3781d89ab90b3854e0: Status 404 returned error can't find the container with id b479de566b73b5f3a4272af0ddc581809b0548013bae8a3781d89ab90b3854e0 Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.513733 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.513783 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.513795 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.513819 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.513835 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:39Z","lastTransitionTime":"2025-10-01T12:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.536967 4851 generic.go:334] "Generic (PLEG): container finished" podID="3e98556a-d5e5-44d6-ad39-d13303fc263c" containerID="a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac" exitCode=0 Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.537098 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" event={"ID":"3e98556a-d5e5-44d6-ad39-d13303fc263c","Type":"ContainerDied","Data":"a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac"} Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.542071 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rc8gl" event={"ID":"2437d3f1-5aab-476a-9c9d-16781db5aa71","Type":"ContainerStarted","Data":"b479de566b73b5f3a4272af0ddc581809b0548013bae8a3781d89ab90b3854e0"} Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.552242 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPa
th\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.558928 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerStarted","Data":"e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69"} Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.564336 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.575438 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.585633 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.603077 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.615074 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.617532 4851 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.617579 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.617592 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.617612 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.617628 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:39Z","lastTransitionTime":"2025-10-01T12:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.628078 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.641064 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.653899 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.681773 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z 
is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.698610 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\
",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.711304 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@s
ha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.720924 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.720989 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.721007 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.721038 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.721057 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:39Z","lastTransitionTime":"2025-10-01T12:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.724028 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.736092 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.825366 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.825411 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.825422 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.825443 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.825456 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:39Z","lastTransitionTime":"2025-10-01T12:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.928650 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.928714 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.928728 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.928751 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:39 crc kubenswrapper[4851]: I1001 12:53:39.928767 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:39Z","lastTransitionTime":"2025-10-01T12:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.032552 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.032598 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.032610 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.032629 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.032642 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:40Z","lastTransitionTime":"2025-10-01T12:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.072075 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.072221 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.072260 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.072290 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.072323 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:53:40 crc kubenswrapper[4851]: E1001 12:53:40.072446 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:53:40 crc kubenswrapper[4851]: E1001 12:53:40.072468 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:53:40 crc kubenswrapper[4851]: E1001 12:53:40.072482 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:53:40 crc kubenswrapper[4851]: E1001 12:53:40.072488 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:53:40 crc kubenswrapper[4851]: E1001 12:53:40.072515 4851 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:53:40 crc 
kubenswrapper[4851]: E1001 12:53:40.072526 4851 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:53:40 crc kubenswrapper[4851]: E1001 12:53:40.072446 4851 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:53:40 crc kubenswrapper[4851]: E1001 12:53:40.072573 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:48.072554899 +0000 UTC m=+36.417672395 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:53:40 crc kubenswrapper[4851]: E1001 12:53:40.072595 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:48.07258589 +0000 UTC m=+36.417703386 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:53:40 crc kubenswrapper[4851]: E1001 12:53:40.072611 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:48.07260375 +0000 UTC m=+36.417721246 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:53:40 crc kubenswrapper[4851]: E1001 12:53:40.072615 4851 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:53:40 crc kubenswrapper[4851]: E1001 12:53:40.072689 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:53:48.072622231 +0000 UTC m=+36.417739767 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:53:40 crc kubenswrapper[4851]: E1001 12:53:40.072799 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:48.072772065 +0000 UTC m=+36.417889811 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.136786 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.136842 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.136861 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.136927 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.136955 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:40Z","lastTransitionTime":"2025-10-01T12:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.240546 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.240601 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.240619 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.240649 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.240668 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:40Z","lastTransitionTime":"2025-10-01T12:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.343109 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.343164 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.343181 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.343207 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.343227 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:40Z","lastTransitionTime":"2025-10-01T12:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.446996 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.447082 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.447110 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.447139 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.447157 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:40Z","lastTransitionTime":"2025-10-01T12:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.550024 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.550535 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.550560 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.550595 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.550614 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:40Z","lastTransitionTime":"2025-10-01T12:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.563775 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rc8gl" event={"ID":"2437d3f1-5aab-476a-9c9d-16781db5aa71","Type":"ContainerStarted","Data":"3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69"} Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.568253 4851 generic.go:334] "Generic (PLEG): container finished" podID="3e98556a-d5e5-44d6-ad39-d13303fc263c" containerID="a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6" exitCode=0 Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.568292 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" event={"ID":"3e98556a-d5e5-44d6-ad39-d13303fc263c","Type":"ContainerDied","Data":"a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6"} Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.589032 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.606181 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.618320 4851 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.635105 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.650944 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.653118 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.653145 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.653183 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.653199 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.653223 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:40Z","lastTransitionTime":"2025-10-01T12:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
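
Every "Failed to update status for pod" record in this stretch embeds the full strategic merge patch that the kubelet status manager tried to apply. The JSON is hard to read because it is quoted twice, once when the error message embeds the patch and once more when klog renders the err="..." field, and the patch itself is never the problem: the API server rejects the write because it cannot complete a call to the pod.network-node-identity.openshift.io admission webhook. A minimal Python sketch that recovers one readable patch from such a record; the kubelet.log capture file and one-record-per-line layout are assumptions:

#!/usr/bin/env python3
# Minimal sketch: recover one readable status patch from a
# "Failed to update status for pod" record. The patch is escaped twice
# (once when the error text embeds it, once when klog quotes err="..."),
# so two unescape passes restore plain JSON. Assumptions: capture file
# kubelet.log, one record per line.
import json
import re

def unescape(s: str) -> str:
    # One level of Go %q-style unescaping; the payload is ASCII-only,
    # so handling \\ and \" is enough.
    return s.replace('\\\\', '\x00').replace('\\"', '"').replace('\x00', '\\')

with open("kubelet.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        m = re.search(r'failed to patch status \\"(\{.*\})\\" for pod', line)
        if not m:
            continue
        patch = json.loads(unescape(unescape(m.group(1))))
        # $setElementOrder/conditions pins the order of the condition
        # list; the conditions entries carry the actual updates.
        print([c["type"] for c in patch["status"]["conditions"]])
        break

For the multus-additional-cni-plugins-brwh5 record above, the recovered patch carries the $setElementOrder/conditions ordering list plus the updated conditions, containerStatuses, and initContainerStatuses.
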
Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.675263 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.718408 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.749645 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.758493 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.758564 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.758581 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.758601 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.758614 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:40Z","lastTransitionTime":"2025-10-01T12:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.765240 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.787164 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.800670 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.820910 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.832969 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.844267 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.857995 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.861861 4851 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.862197 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.862209 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.862239 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.862250 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:40Z","lastTransitionTime":"2025-10-01T12:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.874776 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.889835 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.904249 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.917536 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.930218 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.941763 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.958229 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"
name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-1
0-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.965186 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.965218 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.965230 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.965246 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.965258 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:40Z","lastTransitionTime":"2025-10-01T12:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.972040 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:40 crc kubenswrapper[4851]: I1001 12:53:40.986442 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.004981 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.029289 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.046691 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k
8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.062799 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.067554 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.067620 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.067637 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.067662 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.067678 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:41Z","lastTransitionTime":"2025-10-01T12:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.170637 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.170723 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.170750 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.170785 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.170809 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:41Z","lastTransitionTime":"2025-10-01T12:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.272778 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.272806 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.272815 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.272828 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.272837 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:41Z","lastTransitionTime":"2025-10-01T12:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.328146 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.328169 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:53:41 crc kubenswrapper[4851]: E1001 12:53:41.328367 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.329329 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:53:41 crc kubenswrapper[4851]: E1001 12:53:41.329439 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:53:41 crc kubenswrapper[4851]: E1001 12:53:41.329565 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.377066 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.377127 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.377151 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.377180 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.377206 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:41Z","lastTransitionTime":"2025-10-01T12:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.480153 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.480191 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.480203 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.480217 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.480228 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:41Z","lastTransitionTime":"2025-10-01T12:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.576414 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerStarted","Data":"c285c68a41ba38145c0bce62d61797863fc1d874335209bfc3c1150cb0d2cb60"} Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.576790 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.585729 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.585762 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.585774 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.585790 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.585803 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:41Z","lastTransitionTime":"2025-10-01T12:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.588874 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.591898 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" event={"ID":"3e98556a-d5e5-44d6-ad39-d13303fc263c","Type":"ContainerStarted","Data":"ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe"} Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.602107 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.620830 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.624079 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c285c68a41ba38145c0bce62d6179
7863fc1d874335209bfc3c1150cb0d2cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.638745 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.653680 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.675694 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.693150 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.693199 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.693213 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.693231 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.693249 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:41Z","lastTransitionTime":"2025-10-01T12:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.698848 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.720314 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.734386 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.752876 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.769601 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.785487 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.796821 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.796854 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.796864 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.796884 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.796895 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:41Z","lastTransitionTime":"2025-10-01T12:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.808607 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.823107 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.837404 4851 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae
50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.854901 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.883377 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c285c68a41ba38145c0bce62d61797863fc1d874335209bfc3c1150cb0d2cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"
mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.899069 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.899115 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.899127 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.899149 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.899161 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:41Z","lastTransitionTime":"2025-10-01T12:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.904000 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.921241 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.941347 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s 
restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.962414 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:41 crc kubenswrapper[4851]: I1001 12:53:41.981849 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:41.999929 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8
c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.002187 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.002281 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.002304 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.002340 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.002362 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:42Z","lastTransitionTime":"2025-10-01T12:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.015110 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.035542 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.059386 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.078419 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.098086 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 
2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.105412 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.105477 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.105525 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.105562 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.105587 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:42Z","lastTransitionTime":"2025-10-01T12:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.208560 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.208640 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.208659 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.208690 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.208710 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:42Z","lastTransitionTime":"2025-10-01T12:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.312075 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.312149 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.312167 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.312192 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.312216 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:42Z","lastTransitionTime":"2025-10-01T12:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.356134 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.375905 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.398560 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.417021 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.417118 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:42 crc 
kubenswrapper[4851]: I1001 12:53:42.417148 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.417186 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.417224 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:42Z","lastTransitionTime":"2025-10-01T12:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.424646 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.443649 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.465803 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.491121 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.521152 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.521194 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.521205 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.521221 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.521234 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:42Z","lastTransitionTime":"2025-10-01T12:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.523408 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c285c68a41ba38145c0bce62d61797863fc1d874335209bfc3c1150cb0d2cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkub
e-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.537578 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.550870 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.562971 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.577017 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.590109 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.595896 4851 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.596413 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.612349 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.624197 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.624271 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.624293 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.624392 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.624465 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:42Z","lastTransitionTime":"2025-10-01T12:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.625896 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.646307 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.664445 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.682627 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.703182 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.728258 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.728337 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.728358 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.728390 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.728412 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:42Z","lastTransitionTime":"2025-10-01T12:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.729994 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.750132 4851 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 
12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.765820 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.790973 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.818392 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.833206 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.833248 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.833260 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.833278 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.833292 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:42Z","lastTransitionTime":"2025-10-01T12:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.833686 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.852875 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.872964 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.888433 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.910034 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c285c68a41ba38145c0bce62d61797863fc1d874
335209bfc3c1150cb0d2cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.936608 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.936642 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.936651 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.936669 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:42 crc kubenswrapper[4851]: I1001 12:53:42.936683 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:42Z","lastTransitionTime":"2025-10-01T12:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.039348 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.039382 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.039395 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.039409 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.039421 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:43Z","lastTransitionTime":"2025-10-01T12:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.143106 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.143150 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.143160 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.143177 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.143189 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:43Z","lastTransitionTime":"2025-10-01T12:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.246640 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.246719 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.246740 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.246772 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.246792 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:43Z","lastTransitionTime":"2025-10-01T12:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.328331 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 12:53:43 crc kubenswrapper[4851]: E1001 12:53:43.328553 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.328661 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 12:53:43 crc kubenswrapper[4851]: E1001 12:53:43.328751 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.328829 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 12:53:43 crc kubenswrapper[4851]: E1001 12:53:43.328912 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.353680 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.353808 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.353817 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.353833 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.353843 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:43Z","lastTransitionTime":"2025-10-01T12:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
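
The "No sandbox" / "Error syncing pod" pairs above and the NodeNotReady churn throughout this window all hinge on one probe: the runtime reports NetworkReady=false until at least one CNI config file shows up in /etc/kubernetes/cni/net.d/ (here, ovnkube-controller has not yet written one). A minimal sketch of that directory probe, assuming only the path from the log; the extension set (*.conf, *.conflist, *.json) is roughly what CRI-O's ocicni loader accepts, and hasCNIConfig is an illustrative helper, not kubelet or CRI-O source:

```go
// Illustrative sketch: approximates the check behind "no CNI configuration
// file in /etc/kubernetes/cni/net.d/. Has your network provider started?".
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig reports whether dir contains any plausible CNI config file.
func hasCNIConfig(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // extensions ocicni scans for
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d") // path from the log
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	// false here corresponds to the NetworkReady=false condition above.
	fmt.Println("CNI config present:", ok)
}
```
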
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.456750 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.456790 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.456799 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.456812 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.456822 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:43Z","lastTransitionTime":"2025-10-01T12:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.559979 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.560013 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.560025 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.560042 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.560054 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:43Z","lastTransitionTime":"2025-10-01T12:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.597914 4851 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.662442 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.662480 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.662491 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.662521 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.662534 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:43Z","lastTransitionTime":"2025-10-01T12:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.765220 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.765273 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.765288 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.765308 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.765322 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:43Z","lastTransitionTime":"2025-10-01T12:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
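
The status_manager failures earlier in this window (12:53:42) and again at 12:53:44 below all die at the same TLS step: the network-node-identity webhook on 127.0.0.1:9743 presents a serving certificate whose notAfter (2025-08-24T17:21:41Z) is weeks behind the node clock, so every status patch is rejected before it reaches the API server. A hedged sketch of how one might confirm that from the node, using only the endpoint quoted in the log; InsecureSkipVerify is deliberate here so the handshake succeeds and the validity window can be inspected by hand:

```go
// Illustrative sketch: reproduces the x509 validity-window check that fails
// in the "failed calling webhook" entries above.
package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Endpoint taken from the log: Post "https://127.0.0.1:9743/pod?timeout=10s".
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // fetch the cert even though it is expired
	})
	if err != nil {
		fmt.Println("dial:", err)
		return
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now()
	fmt.Printf("NotBefore=%s NotAfter=%s now=%s\n", cert.NotBefore, cert.NotAfter, now)
	if now.After(cert.NotAfter) || now.Before(cert.NotBefore) {
		// Same window test that yields the log's error string.
		fmt.Println("certificate has expired or is not yet valid")
	}
}
```
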
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.869398 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.869465 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.869487 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.869546 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.869570 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:43Z","lastTransitionTime":"2025-10-01T12:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.971687 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.971726 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.971734 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.971749 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:43 crc kubenswrapper[4851]: I1001 12:53:43.971759 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:43Z","lastTransitionTime":"2025-10-01T12:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.075052 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.075095 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.075110 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.075124 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.075134 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:44Z","lastTransitionTime":"2025-10-01T12:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.182452 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.182526 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.182543 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.182562 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.182575 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:44Z","lastTransitionTime":"2025-10-01T12:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.286166 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.286233 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.286260 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.286292 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.286318 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:44Z","lastTransitionTime":"2025-10-01T12:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.393284 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.394156 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.394282 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.394389 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.394476 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:44Z","lastTransitionTime":"2025-10-01T12:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.497171 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.497213 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.497224 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.497241 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.497255 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:44Z","lastTransitionTime":"2025-10-01T12:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.599755 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.600794 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.600959 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.601117 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.601256 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:44Z","lastTransitionTime":"2025-10-01T12:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.603284 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s78wn_eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9/ovnkube-controller/0.log" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.606822 4851 generic.go:334] "Generic (PLEG): container finished" podID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerID="c285c68a41ba38145c0bce62d61797863fc1d874335209bfc3c1150cb0d2cb60" exitCode=1 Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.606946 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerDied","Data":"c285c68a41ba38145c0bce62d61797863fc1d874335209bfc3c1150cb0d2cb60"} Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.608066 4851 scope.go:117] "RemoveContainer" containerID="c285c68a41ba38145c0bce62d61797863fc1d874335209bfc3c1150cb0d2cb60" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.626005 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:44Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.649433 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:44Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.668593 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:44Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.680207 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:44Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.702594 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"starte
dAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.16
8.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:44Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.704191 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.704283 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.704300 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.704324 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.704342 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:44Z","lastTransitionTime":"2025-10-01T12:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.718232 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:44Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.739127 4851 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae
50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:44Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.771390 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:44Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.798477 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d
1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c285c68a41ba38145c0bce62d61797863fc1d874335209bfc3c1150cb0d2cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c285c68a41ba38145c0bce62d61797863fc1d874335209bfc3c1150cb0d2cb60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:53:44Z\\\",\\\"message\\\":\\\"\\\\nI1001 12:53:44.172305 6149 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1001 12:53:44.172543 6149 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1001 12:53:44.172644 6149 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 12:53:44.174752 6149 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:53:44.174816 6149 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 12:53:44.174766 6149 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 12:53:44.174770 6149 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 12:53:44.174948 6149 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 12:53:44.174980 6149 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 12:53:44.175123 6149 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 12:53:44.175178 6149 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 12:53:44.175260 6149 factory.go:656] Stopping watch factory\\\\nI1001 12:53:44.175313 6149 ovnkube.go:599] Stopped ovnkube\\\\nI1001 12:53:44.175371 6149 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 12:53:44.175405 6149 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 12:53:44.175437 6149 handler.go:208] Removed *v1.NetworkPolicy event handler 
4\\\\nI1001 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:44Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.807537 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.807576 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.807590 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.807611 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.807624 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:44Z","lastTransitionTime":"2025-10-01T12:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.814235 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:44Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.828734 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:44Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.844399 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:44Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.863225 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:44Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.876580 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:44Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.909428 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.909483 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.909526 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.909556 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:44 crc kubenswrapper[4851]: I1001 12:53:44.909576 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:44Z","lastTransitionTime":"2025-10-01T12:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.011867 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.011899 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.011907 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.011920 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.011928 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:45Z","lastTransitionTime":"2025-10-01T12:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.114264 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.114308 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.114322 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.114344 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.114356 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:45Z","lastTransitionTime":"2025-10-01T12:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.217297 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.217353 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.217365 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.217383 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.217395 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:45Z","lastTransitionTime":"2025-10-01T12:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.320530 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.320579 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.320591 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.320610 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.320622 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:45Z","lastTransitionTime":"2025-10-01T12:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.327895 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.327963 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.327973 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:53:45 crc kubenswrapper[4851]: E1001 12:53:45.328067 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:53:45 crc kubenswrapper[4851]: E1001 12:53:45.328180 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:53:45 crc kubenswrapper[4851]: E1001 12:53:45.328399 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.424021 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.424081 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.424100 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.424127 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.424146 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:45Z","lastTransitionTime":"2025-10-01T12:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.526596 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.526664 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.526688 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.526721 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.526746 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:45Z","lastTransitionTime":"2025-10-01T12:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.612808 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s78wn_eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9/ovnkube-controller/0.log" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.618164 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerStarted","Data":"aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb"} Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.618375 4851 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.629718 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.629766 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.629779 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.629800 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.629814 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:45Z","lastTransitionTime":"2025-10-01T12:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.640770 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:45Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.668843 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:45Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.687688 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:45Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.701004 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:45Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.712024 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:45Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.723330 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:45Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.732055 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.732113 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.732137 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.732168 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.732194 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:45Z","lastTransitionTime":"2025-10-01T12:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.740814 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:45Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.760551 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:45Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.774966 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:45Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.789862 4851 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:45Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.810606 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:45Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.831898 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:45Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.834990 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.835029 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.835041 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.835058 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.835070 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:45Z","lastTransitionTime":"2025-10-01T12:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.866387 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1
b5fd31e0d19dbcd614268eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c285c68a41ba38145c0bce62d61797863fc1d874335209bfc3c1150cb0d2cb60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:53:44Z\\\",\\\"message\\\":\\\"\\\\nI1001 12:53:44.172305 6149 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1001 12:53:44.172543 6149 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1001 12:53:44.172644 6149 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 12:53:44.174752 6149 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:53:44.174816 6149 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 12:53:44.174766 6149 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 12:53:44.174770 6149 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 12:53:44.174948 6149 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 12:53:44.174980 6149 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 12:53:44.175123 6149 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 12:53:44.175178 6149 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 12:53:44.175260 6149 factory.go:656] Stopping watch factory\\\\nI1001 12:53:44.175313 6149 ovnkube.go:599] Stopped ovnkube\\\\nI1001 12:53:44.175371 6149 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 12:53:44.175405 6149 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 12:53:44.175437 6149 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:45Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.886455 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:45Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.937846 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.937904 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.937921 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.937946 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:45 crc kubenswrapper[4851]: I1001 12:53:45.937965 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:45Z","lastTransitionTime":"2025-10-01T12:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.041251 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.041327 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.041344 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.041369 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.041386 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:46Z","lastTransitionTime":"2025-10-01T12:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.144940 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.145012 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.145031 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.145061 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.145080 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:46Z","lastTransitionTime":"2025-10-01T12:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.248171 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.248225 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.248241 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.248263 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.248278 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:46Z","lastTransitionTime":"2025-10-01T12:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.328896 4851 scope.go:117] "RemoveContainer" containerID="4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.354065 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.354115 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.354129 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.354151 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.354166 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:46Z","lastTransitionTime":"2025-10-01T12:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.457953 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.457994 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.458006 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.458023 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.458036 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:46Z","lastTransitionTime":"2025-10-01T12:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.561378 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.561434 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.561445 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.561463 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.561476 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:46Z","lastTransitionTime":"2025-10-01T12:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.624195 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s78wn_eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9/ovnkube-controller/1.log" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.625787 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s78wn_eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9/ovnkube-controller/0.log" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.629439 4851 generic.go:334] "Generic (PLEG): container finished" podID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerID="aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb" exitCode=1 Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.629476 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerDied","Data":"aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb"} Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.629630 4851 scope.go:117] "RemoveContainer" containerID="c285c68a41ba38145c0bce62d61797863fc1d874335209bfc3c1150cb0d2cb60" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.630287 4851 scope.go:117] "RemoveContainer" containerID="aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb" Oct 01 12:53:46 crc kubenswrapper[4851]: E1001 12:53:46.630454 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-s78wn_openshift-ovn-kubernetes(eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.646908 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:46Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.661455 4851 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:46Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.663886 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.663935 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.663954 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.664024 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.664045 4851 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:46Z","lastTransitionTime":"2025-10-01T12:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.681738 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:46Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.709600 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:46Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.768948 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.768981 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.768992 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.769010 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.769022 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:46Z","lastTransitionTime":"2025-10-01T12:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.772052 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:46Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.773189 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx"] Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.773792 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.776206 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.776281 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.793920 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"n
ame\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:46Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.815647 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:46Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.836619 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:46Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.868548 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8889745c-17a6-44c0-be06-2f45a0f1316a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-v7qmx\" (UID: \"8889745c-17a6-44c0-be06-2f45a0f1316a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.868654 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8889745c-17a6-44c0-be06-2f45a0f1316a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-v7qmx\" (UID: \"8889745c-17a6-44c0-be06-2f45a0f1316a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.868736 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxczz\" (UniqueName: \"kubernetes.io/projected/8889745c-17a6-44c0-be06-2f45a0f1316a-kube-api-access-wxczz\") pod \"ovnkube-control-plane-749d76644c-v7qmx\" (UID: \"8889745c-17a6-44c0-be06-2f45a0f1316a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.868772 4851 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8889745c-17a6-44c0-be06-2f45a0f1316a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-v7qmx\" (UID: \"8889745c-17a6-44c0-be06-2f45a0f1316a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.870158 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/p
ki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c285c68a41ba38145c0bce62d61797863fc1d874335209bfc3c1150cb0d2cb60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:53:44Z\\\",\\\"message\\\":\\\"\\\\nI1001 12:53:44.172305 6149 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1001 12:53:44.172543 6149 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1001 12:53:44.172644 6149 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 12:53:44.174752 6149 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:53:44.174816 6149 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 12:53:44.174766 6149 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 12:53:44.174770 6149 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 12:53:44.174948 6149 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 12:53:44.174980 6149 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 12:53:44.175123 6149 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 12:53:44.175178 6149 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 12:53:44.175260 6149 factory.go:656] Stopping watch factory\\\\nI1001 12:53:44.175313 6149 ovnkube.go:599] Stopped ovnkube\\\\nI1001 12:53:44.175371 6149 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 12:53:44.175405 6149 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 12:53:44.175437 6149 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:53:45Z\\\",\\\"message\\\":\\\"735611 6267 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 12:53:45.735622 6267 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 12:53:45.735628 6267 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 12:53:45.735636 6267 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 12:53:45.735642 6267 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 12:53:45.735706 6267 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 12:53:45.736729 6267 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 12:53:45.736768 6267 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 12:53:45.736823 6267 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 12:53:45.736838 6267 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 12:53:45.736858 6267 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 12:53:45.736877 6267 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 12:53:45.736886 6267 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:53:45.736907 6267 factory.go:656] Stopping watch factory\\\\nI1001 12:53:45.736931 6267 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"moun
tPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:46Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.874106 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.874165 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.874184 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.874212 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.874234 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:46Z","lastTransitionTime":"2025-10-01T12:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.897749 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubel
et\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:46Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.915552 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-01T12:53:46Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.936484 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:46Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.969174 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8889745c-17a6-44c0-be06-2f45a0f1316a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-v7qmx\" (UID: \"8889745c-17a6-44c0-be06-2f45a0f1316a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.969521 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8889745c-17a6-44c0-be06-2f45a0f1316a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-v7qmx\" (UID: \"8889745c-17a6-44c0-be06-2f45a0f1316a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.970376 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-wxczz\" (UniqueName: \"kubernetes.io/projected/8889745c-17a6-44c0-be06-2f45a0f1316a-kube-api-access-wxczz\") pod \"ovnkube-control-plane-749d76644c-v7qmx\" (UID: \"8889745c-17a6-44c0-be06-2f45a0f1316a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.970621 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8889745c-17a6-44c0-be06-2f45a0f1316a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-v7qmx\" (UID: \"8889745c-17a6-44c0-be06-2f45a0f1316a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.971631 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8889745c-17a6-44c0-be06-2f45a0f1316a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-v7qmx\" (UID: \"8889745c-17a6-44c0-be06-2f45a0f1316a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.972722 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8889745c-17a6-44c0-be06-2f45a0f1316a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-v7qmx\" (UID: \"8889745c-17a6-44c0-be06-2f45a0f1316a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.973861 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:46Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.983575 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8889745c-17a6-44c0-be06-2f45a0f1316a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-v7qmx\" (UID: \"8889745c-17a6-44c0-be06-2f45a0f1316a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.984031 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.984121 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.984142 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.984169 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:46 crc kubenswrapper[4851]: I1001 12:53:46.984233 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:46Z","lastTransitionTime":"2025-10-01T12:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.017104 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxczz\" (UniqueName: \"kubernetes.io/projected/8889745c-17a6-44c0-be06-2f45a0f1316a-kube-api-access-wxczz\") pod \"ovnkube-control-plane-749d76644c-v7qmx\" (UID: \"8889745c-17a6-44c0-be06-2f45a0f1316a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.041958 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.054373 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha2
56:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.065811 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.078470 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.090180 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.090811 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.090851 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.090864 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.090887 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.090900 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:47Z","lastTransitionTime":"2025-10-01T12:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.104710 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.120628 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.135821 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.154569 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":
\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11
\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.169161 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.180262 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.193029 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.194010 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.194093 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.194107 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.194124 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.194136 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:47Z","lastTransitionTime":"2025-10-01T12:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.220896 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c285c68a41ba38145c0bce62d61797863fc1d874335209bfc3c1150cb0d2cb60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:53:44Z\\\",\\\"message\\\":\\\"\\\\nI1001 12:53:44.172305 6149 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1001 12:53:44.172543 6149 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1001 12:53:44.172644 6149 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 12:53:44.174752 6149 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:53:44.174816 6149 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 12:53:44.174766 6149 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 12:53:44.174770 6149 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 12:53:44.174948 6149 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 12:53:44.174980 6149 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 12:53:44.175123 6149 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 12:53:44.175178 6149 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 12:53:44.175260 6149 factory.go:656] Stopping watch factory\\\\nI1001 12:53:44.175313 6149 ovnkube.go:599] Stopped ovnkube\\\\nI1001 12:53:44.175371 6149 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 12:53:44.175405 6149 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 12:53:44.175437 6149 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:53:45Z\\\",\\\"message\\\":\\\"735611 6267 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 12:53:45.735622 6267 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 12:53:45.735628 6267 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 12:53:45.735636 6267 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 12:53:45.735642 6267 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 12:53:45.735706 6267 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 12:53:45.736729 6267 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 12:53:45.736768 6267 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 12:53:45.736823 6267 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 12:53:45.736838 6267 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 12:53:45.736858 6267 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 12:53:45.736877 6267 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 12:53:45.736886 6267 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:53:45.736907 6267 factory.go:656] Stopping watch factory\\\\nI1001 12:53:45.736931 6267 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"moun
tPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.238124 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.257030 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.270262 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8889745c-17a6-44c0-be06-2f45a0f1316a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7qmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.290041 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.296953 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.296994 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.297003 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.297015 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.297024 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:47Z","lastTransitionTime":"2025-10-01T12:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.327531 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.327578 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:53:47 crc kubenswrapper[4851]: E1001 12:53:47.327653 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:53:47 crc kubenswrapper[4851]: E1001 12:53:47.327756 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.327967 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:53:47 crc kubenswrapper[4851]: E1001 12:53:47.328121 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.400424 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.400472 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.400490 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.400541 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.400559 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:47Z","lastTransitionTime":"2025-10-01T12:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.504403 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.504482 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.504525 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.504552 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.504571 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:47Z","lastTransitionTime":"2025-10-01T12:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.608285 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.608345 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.608363 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.608387 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.608415 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:47Z","lastTransitionTime":"2025-10-01T12:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.636299 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" event={"ID":"8889745c-17a6-44c0-be06-2f45a0f1316a","Type":"ContainerStarted","Data":"8c74d13ddaf4b7f8d5e56156f8bf3cbae697f483f9e6507bd673f1a0858a309a"} Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.636361 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" event={"ID":"8889745c-17a6-44c0-be06-2f45a0f1316a","Type":"ContainerStarted","Data":"152301dc887033a9f46dded0a95a56edf7395806f427e60112739910a81dbc9e"} Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.636380 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" event={"ID":"8889745c-17a6-44c0-be06-2f45a0f1316a","Type":"ContainerStarted","Data":"514b705f69851291a5a4474ed3845d90309287a97de95b7ebcca755498c33ad5"} Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.639678 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s78wn_eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9/ovnkube-controller/1.log" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.645212 4851 scope.go:117] "RemoveContainer" containerID="aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb" Oct 01 12:53:47 crc kubenswrapper[4851]: E1001 12:53:47.645596 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-s78wn_openshift-ovn-kubernetes(eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.646896 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.651544 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"60f77a9c5ff069b1de69400b48ecf50a0f01028d046323e1da2396606c318282"} Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.652163 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.660557 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.684458 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.710096 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c285c68a41ba38145c0bce62d61797863fc1d874335209bfc3c1150cb0d2cb60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:53:44Z\\\",\\\"message\\\":\\\"\\\\nI1001 12:53:44.172305 6149 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1001 12:53:44.172543 6149 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1001 12:53:44.172644 6149 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 12:53:44.174752 6149 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:53:44.174816 6149 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 12:53:44.174766 6149 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 12:53:44.174770 6149 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 12:53:44.174948 6149 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 12:53:44.174980 6149 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 12:53:44.175123 6149 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 12:53:44.175178 6149 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 12:53:44.175260 6149 factory.go:656] Stopping watch factory\\\\nI1001 12:53:44.175313 6149 ovnkube.go:599] Stopped ovnkube\\\\nI1001 12:53:44.175371 6149 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 12:53:44.175405 6149 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 12:53:44.175437 6149 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:53:45Z\\\",\\\"message\\\":\\\"735611 6267 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 12:53:45.735622 6267 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 12:53:45.735628 6267 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 12:53:45.735636 6267 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 12:53:45.735642 6267 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 12:53:45.735706 6267 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 12:53:45.736729 6267 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 12:53:45.736768 6267 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 12:53:45.736823 6267 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 12:53:45.736838 6267 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 12:53:45.736858 6267 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 12:53:45.736877 6267 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 12:53:45.736886 6267 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:53:45.736907 6267 factory.go:656] Stopping watch factory\\\\nI1001 12:53:45.736931 6267 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"moun
tPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.711275 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.711324 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.711340 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.711359 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.711372 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:47Z","lastTransitionTime":"2025-10-01T12:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.728330 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubel
et\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.745180 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.759959 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8889745c-17a6-44c0-be06-2f45a0f1316a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152301dc887033a9f46dded0a95a56edf7395806f427e60112739910a81dbc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c74d13ddaf4b7f8d5e56156f8bf3cbae697f483f9e6507bd673f1a0858a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7qmx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.774787 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.796897 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.814708 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.814744 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.814757 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.814774 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.814786 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:47Z","lastTransitionTime":"2025-10-01T12:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.817995 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.831823 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.844117 4851 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.858713 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.874977 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.886003 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.901069 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 
2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.918697 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.918758 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.918772 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.918802 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.918837 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:47Z","lastTransitionTime":"2025-10-01T12:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.923984 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f77a9c5ff069b1de69400b48ecf50a0f01028d046323e1da2396606c318282\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.942450 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.958738 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.971904 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:47 crc kubenswrapper[4851]: I1001 12:53:47.986877 4851 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:47Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.003645 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.020682 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.023515 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.023547 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.023562 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.023583 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.023598 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:48Z","lastTransitionTime":"2025-10-01T12:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.039294 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.068680 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.086858 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.087006 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:53:48 crc kubenswrapper[4851]: E1001 12:53:48.087059 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:54:04.087028206 +0000 UTC m=+52.432145702 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.087106 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.087153 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.087221 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:53:48 crc kubenswrapper[4851]: E1001 12:53:48.087229 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:53:48 crc kubenswrapper[4851]: E1001 12:53:48.087256 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:53:48 crc kubenswrapper[4851]: E1001 12:53:48.087437 4851 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:53:48 crc kubenswrapper[4851]: E1001 
12:53:48.087318 4851 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:53:48 crc kubenswrapper[4851]: E1001 12:53:48.087602 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:54:04.087581032 +0000 UTC m=+52.432698548 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:53:48 crc kubenswrapper[4851]: E1001 12:53:48.087359 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:53:48 crc kubenswrapper[4851]: E1001 12:53:48.087649 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:53:48 crc kubenswrapper[4851]: E1001 12:53:48.087664 4851 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:53:48 crc kubenswrapper[4851]: E1001 12:53:48.087710 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:54:04.087693365 +0000 UTC m=+52.432810891 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:53:48 crc kubenswrapper[4851]: E1001 12:53:48.087371 4851 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:53:48 crc kubenswrapper[4851]: E1001 12:53:48.087764 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:54:04.087753217 +0000 UTC m=+52.432870733 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:53:48 crc kubenswrapper[4851]: E1001 12:53:48.087805 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:54:04.087794488 +0000 UTC m=+52.432912004 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.088918 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.111433 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.126546 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.126608 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.126635 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.126667 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.126686 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:48Z","lastTransitionTime":"2025-10-01T12:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.143786 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:53:45Z\\\",\\\"message\\\":\\\"735611 6267 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 12:53:45.735622 6267 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 12:53:45.735628 6267 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 12:53:45.735636 6267 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 12:53:45.735642 6267 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 12:53:45.735706 6267 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 12:53:45.736729 6267 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 12:53:45.736768 6267 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 12:53:45.736823 6267 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 12:53:45.736838 6267 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 12:53:45.736858 6267 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 12:53:45.736877 6267 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 12:53:45.736886 6267 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:53:45.736907 6267 factory.go:656] Stopping watch factory\\\\nI1001 12:53:45.736931 6267 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-s78wn_openshift-ovn-kubernetes(eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.165321 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-
cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.184760 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.202786 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8889745c-17a6-44c0-be06-2f45a0f1316a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152301dc887033a9f46dded0a95a56edf7395806f427e60112739910a81dbc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c74d13ddaf4b7f8d5e56156f8bf3cbae697f483f9e6507bd673f1a0858a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7qmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 
12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.230619 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.230676 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.230700 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.230726 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.230745 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:48Z","lastTransitionTime":"2025-10-01T12:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.334335 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.334398 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.334417 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.334448 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.334465 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:48Z","lastTransitionTime":"2025-10-01T12:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.438350 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.438416 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.438435 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.438477 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.438496 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:48Z","lastTransitionTime":"2025-10-01T12:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.541223 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.541281 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.541305 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.541334 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.541356 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:48Z","lastTransitionTime":"2025-10-01T12:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.644387 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.644454 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.644471 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.644536 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.644555 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:48Z","lastTransitionTime":"2025-10-01T12:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.688044 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-75dqp"]
Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.688818 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp"
Oct 01 12:53:48 crc kubenswrapper[4851]: E1001 12:53:48.688971 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.708719 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8889745c-17a6-44c0-be06-2f45a0f1316a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152301dc887033a9f46dded0a95a56edf7395806f427e60112739910a81dbc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c74d13ddaf4b7f8d5e56156f8bf3cbae697f483f9e6507bd673f1a0858a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7qmx\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.732893 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.748653 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.748709 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.748727 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.748757 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.748777 4851 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:48Z","lastTransitionTime":"2025-10-01T12:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.758225 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apis
erver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f77a9c5ff069b1de69400b48ecf50a0f01028d046323e1da2396606c318282\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.778830 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.793446 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnvh2\" (UniqueName: \"kubernetes.io/projected/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-kube-api-access-hnvh2\") pod \"network-metrics-daemon-75dqp\" (UID: \"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\") " pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.793575 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs\") pod \"network-metrics-daemon-75dqp\" (UID: \"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\") " pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.800631 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.824547 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.839145 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.839208 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.839226 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.839251 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.839270 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:48Z","lastTransitionTime":"2025-10-01T12:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.849716 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: E1001 12:53:48.863303 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.870192 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.871660 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 
12:53:48.872161 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.872189 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.872223 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.872247 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:48Z","lastTransitionTime":"2025-10-01T12:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.894435 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs\") pod \"network-metrics-daemon-75dqp\" (UID: \"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\") " pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.894493 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnvh2\" (UniqueName: \"kubernetes.io/projected/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-kube-api-access-hnvh2\") pod \"network-metrics-daemon-75dqp\" (UID: \"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\") " pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:53:48 crc kubenswrapper[4851]: E1001 12:53:48.894751 4851 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:53:48 crc kubenswrapper[4851]: E1001 12:53:48.894898 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs podName:8a8fe88f-cdfe-4415-98a0-4cc8f018a962 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:49.394841776 +0000 UTC m=+37.739959272 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs") pod "network-metrics-daemon-75dqp" (UID: "8a8fe88f-cdfe-4415-98a0-4cc8f018a962") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:53:48 crc kubenswrapper[4851]: E1001 12:53:48.902843 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.903578 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.908102 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.908150 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:48 crc 
kubenswrapper[4851]: I1001 12:53:48.908169 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.908194 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.908214 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:48Z","lastTransitionTime":"2025-10-01T12:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.924125 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.933088 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnvh2\" (UniqueName: \"kubernetes.io/projected/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-kube-api-access-hnvh2\") pod \"network-metrics-daemon-75dqp\" (UID: \"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\") " pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:53:48 crc kubenswrapper[4851]: E1001 12:53:48.932925 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.937132 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.938351 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.938489 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.938644 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.938823 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.938939 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:48Z","lastTransitionTime":"2025-10-01T12:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.953544 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: E1001 12:53:48.958687 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.962863 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.963030 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.963170 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.963283 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.963385 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:48Z","lastTransitionTime":"2025-10-01T12:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.980030 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:53:45Z\\\",\\\"message\\\":\\\"735611 6267 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 12:53:45.735622 6267 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 12:53:45.735628 6267 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 12:53:45.735636 6267 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 12:53:45.735642 6267 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 12:53:45.735706 6267 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 12:53:45.736729 6267 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 12:53:45.736768 6267 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 12:53:45.736823 6267 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 12:53:45.736838 6267 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 12:53:45.736858 6267 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 12:53:45.736877 6267 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 12:53:45.736886 6267 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:53:45.736907 6267 factory.go:656] Stopping watch factory\\\\nI1001 12:53:45.736931 6267 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-s78wn_openshift-ovn-kubernetes(eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: E1001 12:53:48.982631 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:48 crc kubenswrapper[4851]: E1001 12:53:48.982893 4851 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.985164 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.985206 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.985222 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.985247 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.985266 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:48Z","lastTransitionTime":"2025-10-01T12:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:48 crc kubenswrapper[4851]: I1001 12:53:48.994481 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.008037 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-75dqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-75dqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:49Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.027603 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:49Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.088091 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.088151 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.088169 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.088194 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.088212 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:49Z","lastTransitionTime":"2025-10-01T12:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.191902 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.191973 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.192004 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.192035 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.192058 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:49Z","lastTransitionTime":"2025-10-01T12:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.295291 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.295607 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.295878 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.296017 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.296155 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:49Z","lastTransitionTime":"2025-10-01T12:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.328105 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:53:49 crc kubenswrapper[4851]: E1001 12:53:49.328482 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.328273 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:53:49 crc kubenswrapper[4851]: E1001 12:53:49.329204 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.328264 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:53:49 crc kubenswrapper[4851]: E1001 12:53:49.329709 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.400750 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs\") pod \"network-metrics-daemon-75dqp\" (UID: \"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\") " pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.401021 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.401070 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.401090 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.401114 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.401133 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:49Z","lastTransitionTime":"2025-10-01T12:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:49 crc kubenswrapper[4851]: E1001 12:53:49.401269 4851 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:53:49 crc kubenswrapper[4851]: E1001 12:53:49.401341 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs podName:8a8fe88f-cdfe-4415-98a0-4cc8f018a962 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:50.401317134 +0000 UTC m=+38.746434650 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs") pod "network-metrics-daemon-75dqp" (UID: "8a8fe88f-cdfe-4415-98a0-4cc8f018a962") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.503533 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.503588 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.503605 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.503630 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.503647 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:49Z","lastTransitionTime":"2025-10-01T12:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.607597 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.607655 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.607671 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.607695 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.607712 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:49Z","lastTransitionTime":"2025-10-01T12:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.711065 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.711131 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.711148 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.711177 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.711195 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:49Z","lastTransitionTime":"2025-10-01T12:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.814207 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.814255 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.814271 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.814294 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.814311 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:49Z","lastTransitionTime":"2025-10-01T12:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.918223 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.918281 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.918298 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.918326 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:49 crc kubenswrapper[4851]: I1001 12:53:49.918345 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:49Z","lastTransitionTime":"2025-10-01T12:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.022882 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.022947 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.022964 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.022989 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.023006 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:50Z","lastTransitionTime":"2025-10-01T12:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.126880 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.126941 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.126961 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.126987 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.127004 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:50Z","lastTransitionTime":"2025-10-01T12:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.230043 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.230368 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.230539 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.230685 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.230851 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:50Z","lastTransitionTime":"2025-10-01T12:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.327938 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp"
Oct 01 12:53:50 crc kubenswrapper[4851]: E1001 12:53:50.328194 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.335205 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.335465 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.335674 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.335811 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.335937 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:50Z","lastTransitionTime":"2025-10-01T12:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.412919 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs\") pod \"network-metrics-daemon-75dqp\" (UID: \"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\") " pod="openshift-multus/network-metrics-daemon-75dqp"
Oct 01 12:53:50 crc kubenswrapper[4851]: E1001 12:53:50.413099 4851 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 01 12:53:50 crc kubenswrapper[4851]: E1001 12:53:50.413174 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs podName:8a8fe88f-cdfe-4415-98a0-4cc8f018a962 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:52.413149176 +0000 UTC m=+40.758266692 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs") pod "network-metrics-daemon-75dqp" (UID: "8a8fe88f-cdfe-4415-98a0-4cc8f018a962") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.439669 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.439729 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.439752 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.439782 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.439805 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:50Z","lastTransitionTime":"2025-10-01T12:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.543150 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.543226 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.543254 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.543287 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.543312 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:50Z","lastTransitionTime":"2025-10-01T12:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.646548 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.646634 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.646657 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.646694 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.646715 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:50Z","lastTransitionTime":"2025-10-01T12:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.749563 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.750497 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.750708 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.750909 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.751086 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:50Z","lastTransitionTime":"2025-10-01T12:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.854157 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.854219 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.854237 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.854261 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.854279 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:50Z","lastTransitionTime":"2025-10-01T12:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.957580 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.957649 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.957666 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.957693 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:50 crc kubenswrapper[4851]: I1001 12:53:50.957718 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:50Z","lastTransitionTime":"2025-10-01T12:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.060652 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.060704 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.060732 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.060758 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.060776 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:51Z","lastTransitionTime":"2025-10-01T12:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.163180 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.163221 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.163233 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.163251 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.163265 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:51Z","lastTransitionTime":"2025-10-01T12:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.266812 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.266875 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.266894 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.266920 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.266939 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:51Z","lastTransitionTime":"2025-10-01T12:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.327904 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.327988 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.328149 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 12:53:51 crc kubenswrapper[4851]: E1001 12:53:51.328131 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 12:53:51 crc kubenswrapper[4851]: E1001 12:53:51.328349 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 12:53:51 crc kubenswrapper[4851]: E1001 12:53:51.328664 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.371060 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.371135 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.371154 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.371182 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.371202 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:51Z","lastTransitionTime":"2025-10-01T12:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.475043 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.475123 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.475141 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.475168 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.475191 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:51Z","lastTransitionTime":"2025-10-01T12:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.580483 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.580581 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.580601 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.580628 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.580654 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:51Z","lastTransitionTime":"2025-10-01T12:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.683537 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.683617 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.683643 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.683674 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.683698 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:51Z","lastTransitionTime":"2025-10-01T12:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.786716 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.786798 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.786822 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.786884 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.786908 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:51Z","lastTransitionTime":"2025-10-01T12:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.890125 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.890177 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.890196 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.890221 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.890238 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:51Z","lastTransitionTime":"2025-10-01T12:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.993567 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.993624 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.993648 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.993680 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:51 crc kubenswrapper[4851]: I1001 12:53:51.993700 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:51Z","lastTransitionTime":"2025-10-01T12:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.096595 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.096676 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.096700 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.096737 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.096761 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:52Z","lastTransitionTime":"2025-10-01T12:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.199729 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.199789 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.199815 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.199841 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.199859 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:52Z","lastTransitionTime":"2025-10-01T12:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.303783 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.303840 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.303862 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.303894 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.303918 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:52Z","lastTransitionTime":"2025-10-01T12:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.328213 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp"
Oct 01 12:53:52 crc kubenswrapper[4851]: E1001 12:53:52.328426 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962"
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.351075 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:52Z is after 2025-08-24T17:21:41Z"
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.369387 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-75dqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-75dqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:52Z is after 2025-08-24T17:21:41Z"
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.388396 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:52Z is after 2025-08-24T17:21:41Z"
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.407238 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.407322 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.407345 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.407385 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.407407 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:52Z","lastTransitionTime":"2025-10-01T12:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.412670 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:52Z is after 2025-08-24T17:21:41Z"
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.441585 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs\") pod \"network-metrics-daemon-75dqp\" (UID: \"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\") " pod="openshift-multus/network-metrics-daemon-75dqp"
Oct 01 12:53:52 crc kubenswrapper[4851]: E1001 12:53:52.441764 4851 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 01 12:53:52 crc kubenswrapper[4851]: E1001 12:53:52.441846 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs podName:8a8fe88f-cdfe-4415-98a0-4cc8f018a962 nodeName:}" failed. No retries permitted until 2025-10-01 12:53:56.441821683 +0000 UTC m=+44.786939209 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs") pod "network-metrics-daemon-75dqp" (UID: "8a8fe88f-cdfe-4415-98a0-4cc8f018a962") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.445056 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:53:45Z\\\",\\\"message\\\":\\\"735611 6267 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 12:53:45.735622 6267 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 12:53:45.735628 6267 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 12:53:45.735636 6267 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 12:53:45.735642 6267 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 12:53:45.735706 6267 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 12:53:45.736729 6267 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 12:53:45.736768 6267 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 12:53:45.736823 6267 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 12:53:45.736838 6267 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 12:53:45.736858 6267 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 12:53:45.736877 6267 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 12:53:45.736886 6267 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:53:45.736907 6267 factory.go:656] Stopping watch factory\\\\nI1001 12:53:45.736931 6267 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-s78wn_openshift-ovn-kubernetes(eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.463767 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.481651 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8889745c-17a6-44c0-be06-2f45a0f1316a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152301dc887033a9f46dded0a95a56edf7395806f427e60112739910a81dbc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c74d13ddaf4b7f8d5e56156f8bf3cbae697f483f9e6507bd673f1a0858a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7qmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:52Z is after 2025-08-24T17:21:41Z" Oct 01 
12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.504396 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f77a9c5ff069b1de69400b48ecf50a0f01028d046323e1da2396606c318282\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.510617 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.510677 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.510694 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.510720 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.510738 4851 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:52Z","lastTransitionTime":"2025-10-01T12:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.524176 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.556932 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.580224 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"start
edAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.1
68.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.596805 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",
\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.611746 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.615311 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:52 crc 
kubenswrapper[4851]: I1001 12:53:52.615391 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.615419 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.615452 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.615475 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:52Z","lastTransitionTime":"2025-10-01T12:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.630405 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.647595 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.667078 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.718415 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.718470 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.718486 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.718534 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.718552 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:52Z","lastTransitionTime":"2025-10-01T12:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.821852 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.821897 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.821908 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.821928 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.821941 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:52Z","lastTransitionTime":"2025-10-01T12:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.925551 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.925629 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.925652 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.925680 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:52 crc kubenswrapper[4851]: I1001 12:53:52.925699 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:52Z","lastTransitionTime":"2025-10-01T12:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.028326 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.028385 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.028401 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.028426 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.028446 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:53Z","lastTransitionTime":"2025-10-01T12:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.132345 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.132406 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.132423 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.132449 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.132469 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:53Z","lastTransitionTime":"2025-10-01T12:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
[ node-status block repeats unchanged except timestamps at 12:53:53.237 ]
Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.328315 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.328466 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.328611 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 12:53:53 crc kubenswrapper[4851]: E1001 12:53:53.328540 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 12:53:53 crc kubenswrapper[4851]: E1001 12:53:53.328756 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 12:53:53 crc kubenswrapper[4851]: E1001 12:53:53.328978 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
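The kubelet stays NotReady because /etc/kubernetes/cni/net.d/ contains no network configuration yet; on this cluster that file is normally written by the OVN-Kubernetes pods once ovnkube-node is healthy, so the fix is to get that DaemonSet running, not to hand-write a config. Purely to illustrate the kind of file the kubelet is scanning for, a sketch that emits a minimal CNI conf (filename and every value are hypothetical, and the `bridge` plugin is a stand-in, not what OpenShift uses):

```python
import json

# Illustrative only: a minimal CNI network configuration of the shape the
# kubelet looks for in /etc/kubernetes/cni/net.d/. All values are hypothetical;
# on OpenShift, OVN-Kubernetes writes its own config (ovn-k8s-cni-overlay).
conf = {
    "cniVersion": "0.4.0",
    "name": "example-net",
    "type": "bridge",          # stand-in plugin for illustration
    "bridge": "cni0",
    "isGateway": True,
    "ipMasq": True,
    "ipam": {
        "type": "host-local",
        "subnet": "10.88.0.0/16",
    },
}

# Written to /tmp so the sketch is harmless; a real conf would live in net.d/.
with open("/tmp/10-example.conf", "w") as f:
    json.dump(conf, f, indent=2)
```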
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.340221 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.340277 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.340289 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.340305 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.340316 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:53Z","lastTransitionTime":"2025-10-01T12:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.443400 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.443526 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.443545 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.443570 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:53 crc kubenswrapper[4851]: I1001 12:53:53.443587 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:53Z","lastTransitionTime":"2025-10-01T12:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
[ node-status block repeats unchanged except timestamps at 12:53:53.546, 12:53:53.650 and 12:53:53.753 ]
[ node-status block repeats unchanged except timestamps at 12:53:53.856, 12:53:53.959 and 12:53:54.063 ]
[ node-status block repeats unchanged except timestamps at 12:53:54.166 and 12:53:54.269 ]
Oct 01 12:53:54 crc kubenswrapper[4851]: I1001 12:53:54.327920 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp"
Oct 01 12:53:54 crc kubenswrapper[4851]: E1001 12:53:54.328120 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962"
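The `kubelet_node_status.go:724` lines are the kubelet recording Events against the `crc` Node object, so the same stream is visible from the API side without scraping the journal. A sketch using the official `kubernetes` Python client (assumptions: the client is installed and a working kubeconfig is reachable):

```python
from kubernetes import client, config

config.load_kube_config()          # or config.load_incluster_config()
v1 = client.CoreV1Api()

# Events recorded against the Node, e.g. the NodeNotReady events above.
events = v1.list_event_for_all_namespaces(
    field_selector="involvedObject.kind=Node,involvedObject.name=crc")
for ev in events.items:
    print(ev.last_timestamp, ev.reason, ev.message)
```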
pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:53:54 crc kubenswrapper[4851]: I1001 12:53:54.371616 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:54 crc kubenswrapper[4851]: I1001 12:53:54.371679 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:54 crc kubenswrapper[4851]: I1001 12:53:54.371696 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:54 crc kubenswrapper[4851]: I1001 12:53:54.371720 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:54 crc kubenswrapper[4851]: I1001 12:53:54.371736 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:54Z","lastTransitionTime":"2025-10-01T12:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:54 crc kubenswrapper[4851]: I1001 12:53:54.475533 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:54 crc kubenswrapper[4851]: I1001 12:53:54.475607 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:54 crc kubenswrapper[4851]: I1001 12:53:54.475635 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:54 crc kubenswrapper[4851]: I1001 12:53:54.475668 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:54 crc kubenswrapper[4851]: I1001 12:53:54.475698 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:54Z","lastTransitionTime":"2025-10-01T12:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
[ node-status block repeats unchanged except timestamps at 12:53:54.578, 12:53:54.684 and 12:53:54.789 ]
[ node-status block repeats unchanged except timestamps at 12:53:54.893, 12:53:54.995 and 12:53:55.098 ]
[ node-status block repeats unchanged except timestamps at 12:53:55.202 and 12:53:55.304 ]
Oct 01 12:53:55 crc kubenswrapper[4851]: I1001 12:53:55.328334 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 12:53:55 crc kubenswrapper[4851]: I1001 12:53:55.328334 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 12:53:55 crc kubenswrapper[4851]: I1001 12:53:55.328492 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 12:53:55 crc kubenswrapper[4851]: E1001 12:53:55.328646 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:53:55 crc kubenswrapper[4851]: E1001 12:53:55.328745 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:53:55 crc kubenswrapper[4851]: E1001 12:53:55.328894 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:53:55 crc kubenswrapper[4851]: I1001 12:53:55.400793 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:53:55 crc kubenswrapper[4851]: I1001 12:53:55.401603 4851 scope.go:117] "RemoveContainer" containerID="aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb" Oct 01 12:53:55 crc kubenswrapper[4851]: E1001 12:53:55.401774 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-s78wn_openshift-ovn-kubernetes(eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" Oct 01 12:53:55 crc kubenswrapper[4851]: I1001 12:53:55.408705 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:55 crc kubenswrapper[4851]: I1001 12:53:55.408762 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:55 crc kubenswrapper[4851]: I1001 12:53:55.408780 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:55 crc kubenswrapper[4851]: I1001 12:53:55.408808 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:55 crc kubenswrapper[4851]: I1001 12:53:55.408849 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:55Z","lastTransitionTime":"2025-10-01T12:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
[ node-status block repeats unchanged except timestamps at 12:53:55.511, 12:53:55.614 and 12:53:55.718 ]
[ node-status block repeats unchanged except timestamps at 12:53:55.821, 12:53:55.924 and 12:53:56.027 ]
[ node-status block repeats unchanged except timestamps at 12:53:56.130 and 12:53:56.234 ]
Oct 01 12:53:56 crc kubenswrapper[4851]: I1001 12:53:56.327876 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp"
Oct 01 12:53:56 crc kubenswrapper[4851]: E1001 12:53:56.328081 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962"
pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:53:56 crc kubenswrapper[4851]: I1001 12:53:56.336927 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:56 crc kubenswrapper[4851]: I1001 12:53:56.336981 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:56 crc kubenswrapper[4851]: I1001 12:53:56.336999 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:56 crc kubenswrapper[4851]: I1001 12:53:56.337021 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:56 crc kubenswrapper[4851]: I1001 12:53:56.337041 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:56Z","lastTransitionTime":"2025-10-01T12:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:56 crc kubenswrapper[4851]: I1001 12:53:56.440262 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:56 crc kubenswrapper[4851]: I1001 12:53:56.440361 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:56 crc kubenswrapper[4851]: I1001 12:53:56.440415 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:56 crc kubenswrapper[4851]: I1001 12:53:56.440460 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:56 crc kubenswrapper[4851]: I1001 12:53:56.440484 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:56Z","lastTransitionTime":"2025-10-01T12:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:56 crc kubenswrapper[4851]: I1001 12:53:56.492656 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs\") pod \"network-metrics-daemon-75dqp\" (UID: \"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\") " pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:53:56 crc kubenswrapper[4851]: E1001 12:53:56.492831 4851 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:53:56 crc kubenswrapper[4851]: E1001 12:53:56.492914 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs podName:8a8fe88f-cdfe-4415-98a0-4cc8f018a962 nodeName:}" failed. No retries permitted until 2025-10-01 12:54:04.49288962 +0000 UTC m=+52.838007146 (durationBeforeRetry 8s). 
[ node-status block repeats unchanged except timestamps at 12:53:56.584 and 12:53:56.687 ]
[ node-status block repeats unchanged except timestamps at 12:53:56.791, 12:53:56.894 and 12:53:56.997 ]
[ node-status block repeats unchanged except timestamps at 12:53:57.101, 12:53:57.212 and 12:53:57.316 ]
Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.327946 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 12:53:57 crc kubenswrapper[4851]: E1001 12:53:57.328149 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.329028 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 12:53:57 crc kubenswrapper[4851]: E1001 12:53:57.329150 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.329238 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 12:53:57 crc kubenswrapper[4851]: E1001 12:53:57.329324 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
[ node-status block repeats unchanged except timestamps at 12:53:57.421 ]
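Rather than following these heartbeat repeats in the journal, the eventual flip back to Ready can be observed from the API with a watch on the Node object (same `kubernetes`-client assumptions as above; the 120s timeout is arbitrary):

```python
from kubernetes import client, config, watch

config.load_kube_config()
v1 = client.CoreV1Api()

# Stream Node updates for "crc" and stop once the Ready condition turns True.
w = watch.Watch()
for event in w.stream(v1.list_node, field_selector="metadata.name=crc",
                      timeout_seconds=120):
    node = event["object"]
    ready = next(c for c in node.status.conditions if c.type == "Ready")
    print(event["type"], ready.status, ready.reason)
    if ready.status == "True":
        w.stop()
```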
Has your network provider started?"} Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.524629 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.524676 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.524861 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.524940 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.524963 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:57Z","lastTransitionTime":"2025-10-01T12:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.627933 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.627993 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.628021 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.628049 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.628071 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:57Z","lastTransitionTime":"2025-10-01T12:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.731301 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.731365 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.731390 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.731421 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.731443 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:57Z","lastTransitionTime":"2025-10-01T12:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.834679 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.834755 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.834781 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.834810 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.834836 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:57Z","lastTransitionTime":"2025-10-01T12:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.937998 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.938085 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.938106 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.938144 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:57 crc kubenswrapper[4851]: I1001 12:53:57.938168 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:57Z","lastTransitionTime":"2025-10-01T12:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.041284 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.041340 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.041354 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.041380 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.041400 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:58Z","lastTransitionTime":"2025-10-01T12:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.144959 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.145004 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.145017 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.145040 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.145055 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:58Z","lastTransitionTime":"2025-10-01T12:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.248402 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.248470 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.248492 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.248571 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.248598 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:58Z","lastTransitionTime":"2025-10-01T12:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.328247 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:53:58 crc kubenswrapper[4851]: E1001 12:53:58.328496 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.351797 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.351862 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.351876 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.351899 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.351914 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:58Z","lastTransitionTime":"2025-10-01T12:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.455158 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.455253 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.455280 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.455319 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.455343 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:58Z","lastTransitionTime":"2025-10-01T12:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.559125 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.559227 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.559251 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.559289 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.559314 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:58Z","lastTransitionTime":"2025-10-01T12:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.661877 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.661955 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.661973 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.662002 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.662023 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:58Z","lastTransitionTime":"2025-10-01T12:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.764805 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.764880 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.764906 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.764939 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.764962 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:58Z","lastTransitionTime":"2025-10-01T12:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.867704 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.867769 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.867783 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.867809 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.867825 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:58Z","lastTransitionTime":"2025-10-01T12:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.970691 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.970777 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.970791 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.970815 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:58 crc kubenswrapper[4851]: I1001 12:53:58.970832 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:58Z","lastTransitionTime":"2025-10-01T12:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
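Every KubeletNotReady heartbeat above cites the same root cause: the kubelet finds no CNI network configuration under /etc/kubernetes/cni/net.d/, so the node's Ready condition stays False and every pod that still needs a sandbox (the network-check, networking-console-plugin, and network-metrics-daemon pods above) fails to sync. A quick way to confirm the state from the node is to look for config files in that directory. The sketch below is a minimal illustration, assuming shell access to the node; the directory path is taken verbatim from the log message, while the extension list (.conf, .conflist, .json) is an assumption, the set CNI loaders conventionally read.

#!/usr/bin/env python3
# Minimal sketch: check whether any CNI network config exists in the
# directory the kubelet complains about. Path taken from the log above;
# the extension set is an assumption (what CNI loaders typically read).
from pathlib import Path

CNI_CONF_DIR = Path("/etc/kubernetes/cni/net.d")

def cni_configs(conf_dir: Path) -> list[Path]:
    exts = {".conf", ".conflist", ".json"}
    if not conf_dir.is_dir():
        return []
    return sorted(p for p in conf_dir.iterdir() if p.suffix in exts)

if __name__ == "__main__":
    found = cni_configs(CNI_CONF_DIR)
    if found:
        for path in found:
            print(f"found CNI config: {path}")
    else:
        # This is the state the kubelet is reporting: NetworkReady=false
        # until the network plugin writes its config here.
        print(f"no CNI config under {CNI_CONF_DIR}; NetworkReady stays False")

On this node the directory presumably stays empty because the ovnkube-controller container is in CrashLoopBackOff (visible further down); in OVN-Kubernetes deployments it is that component which eventually writes the CNI config file into place.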
Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.010932 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.026945 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.036122 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:59Z is after 2025-08-24T17:21:41Z"
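This patch rejection is the first of a series: the same error text recurs below for network-operator-58b4c7f79c-55gtf, ovnkube-node-s78wn, multus-t5vvf, and for the node object itself. In every case the call to the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 fails because the webhook's serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2025-10-01, so each status patch is rejected before it is applied. The sketch below reproduces that TLS verification failure from the node; it is a minimal illustration, with host and port taken from the error message and everything else assumed.

#!/usr/bin/env python3
# Minimal sketch: perform the same verifying TLS handshake that fails in
# the log. Host and port come from the error message above
# ("Post https://127.0.0.1:9743/pod?timeout=10s"); the rest is illustrative.
import socket
import ssl

HOST, PORT = "127.0.0.1", 9743

def check_webhook_cert(host: str, port: int) -> None:
    ctx = ssl.create_default_context()
    # The cert is served on a loopback IP, so skip hostname matching;
    # chain and validity-window checks still run (verify_mode stays
    # CERT_REQUIRED).
    ctx.check_hostname = False
    try:
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock) as tls:
                print(f"handshake OK ({tls.version()}); certificate verified")
    except ssl.SSLCertVerificationError as err:
        # Expected here; with an expired cert verify_message reads
        # "certificate has expired", mirroring the kubelet log (the exact
        # message can vary with how the signing CA is trusted).
        print(f"verification failed: {err.verify_message}")
    except OSError as err:
        print(f"connection failed: {err}")

if __name__ == "__main__":
    check_webhook_cert(HOST, PORT)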
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:59Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.074407 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.074493 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.074548 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.074588 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.074619 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:59Z","lastTransitionTime":"2025-10-01T12:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.089116 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:53:45Z\\\",\\\"message\\\":\\\"735611 6267 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 12:53:45.735622 6267 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 12:53:45.735628 6267 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 12:53:45.735636 6267 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 12:53:45.735642 6267 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 12:53:45.735706 6267 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 12:53:45.736729 6267 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 12:53:45.736768 6267 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 12:53:45.736823 6267 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 12:53:45.736838 6267 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 12:53:45.736858 6267 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 12:53:45.736877 6267 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 12:53:45.736886 6267 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:53:45.736907 6267 factory.go:656] Stopping watch factory\\\\nI1001 12:53:45.736931 6267 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-s78wn_openshift-ovn-kubernetes(eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:59Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.097214 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.097272 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.097286 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.097308 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.097324 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:59Z","lastTransitionTime":"2025-10-01T12:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.115158 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:59Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:59 crc kubenswrapper[4851]: E1001 12:53:59.121207 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:59Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.129579 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.129655 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.129678 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.129710 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.129735 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:59Z","lastTransitionTime":"2025-10-01T12:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.142835 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-75dqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-75dqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:59Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:59 crc kubenswrapper[4851]: E1001 12:53:59.148710 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-10-01T12:53:59Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.154035 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.154100 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.154111 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.154132 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.154146 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:59Z","lastTransitionTime":"2025-10-01T12:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.165538 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:59Z is 
after 2025-08-24T17:21:41Z" Oct 01 12:53:59 crc kubenswrapper[4851]: E1001 12:53:59.173853 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ... image inventory identical to the 12:53:59.148710 entry above, omitted ... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:59Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.179405 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.179452 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.179470 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.179496 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.179548 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:59Z","lastTransitionTime":"2025-10-01T12:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.184570 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8889745c-17a6-44c0-be06-2f45a0f1316a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152301dc887033a9f46dded0a95a56edf7395806f427e60112739910a81dbc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c74d13ddaf4b7f8d5e56156f8bf3cbae697f483f9e6507bd673f1a0858a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7qmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:59Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:59 crc kubenswrapper[4851]: E1001 12:53:59.196009 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ... image inventory identical to the 12:53:59.148710 entry above, omitted ... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:59Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.200814 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.200856 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.200877 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.200908 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.200931 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:59Z","lastTransitionTime":"2025-10-01T12:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.205792 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f77a9c5ff069b1de69400b48ecf50a0f01028d046323e1da2396606c318282\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:59Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:59 crc kubenswrapper[4851]: E1001 12:53:59.218455 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:59Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:59 crc kubenswrapper[4851]: E1001 12:53:59.219155 4851 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.221564 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.221623 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.221645 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.221670 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.221694 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:59Z","lastTransitionTime":"2025-10-01T12:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.226022 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:59Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.246005 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:59Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.261920 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:59Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.279291 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:59Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.296587 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:59Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.317890 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:59Z is after 
2025-08-24T17:21:41Z" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.324367 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.324423 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.324447 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.324471 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.324490 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:59Z","lastTransitionTime":"2025-10-01T12:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.328415 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.328477 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.328440 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:53:59 crc kubenswrapper[4851]: E1001 12:53:59.328666 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:53:59 crc kubenswrapper[4851]: E1001 12:53:59.328735 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:53:59 crc kubenswrapper[4851]: E1001 12:53:59.328808 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.334995 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:59Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.347953 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:53:59Z is after 2025-08-24T17:21:41Z" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.428130 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.428181 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.428198 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:59 crc 
kubenswrapper[4851]: I1001 12:53:59.428222 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.428241 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:59Z","lastTransitionTime":"2025-10-01T12:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.530861 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.531270 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.531413 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.531591 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.531726 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:59Z","lastTransitionTime":"2025-10-01T12:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.635087 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.635131 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.635143 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.635159 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.635172 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:59Z","lastTransitionTime":"2025-10-01T12:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.738465 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.738604 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.738634 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.738674 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.738709 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:59Z","lastTransitionTime":"2025-10-01T12:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.841949 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.842024 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.842042 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.842071 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.842091 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:59Z","lastTransitionTime":"2025-10-01T12:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.944944 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.944980 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.944990 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.945006 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:53:59 crc kubenswrapper[4851]: I1001 12:53:59.945017 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:53:59Z","lastTransitionTime":"2025-10-01T12:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.047905 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.047963 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.047981 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.048008 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.048027 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:00Z","lastTransitionTime":"2025-10-01T12:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.151147 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.151192 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.151204 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.151219 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.151230 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:00Z","lastTransitionTime":"2025-10-01T12:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.253979 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.254015 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.254027 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.254044 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.254055 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:00Z","lastTransitionTime":"2025-10-01T12:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.327765 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:00 crc kubenswrapper[4851]: E1001 12:54:00.328001 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.356600 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.356656 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.356673 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.356697 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.356716 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:00Z","lastTransitionTime":"2025-10-01T12:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.462975 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.463072 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.463097 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.463127 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.463409 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:00Z","lastTransitionTime":"2025-10-01T12:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.567137 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.567206 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.567226 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.567287 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.567307 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:00Z","lastTransitionTime":"2025-10-01T12:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.671683 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.671727 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.671744 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.671769 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.671786 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:00Z","lastTransitionTime":"2025-10-01T12:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.774438 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.774527 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.774551 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.774578 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.774599 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:00Z","lastTransitionTime":"2025-10-01T12:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.877015 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.877050 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.877062 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.877083 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.877097 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:00Z","lastTransitionTime":"2025-10-01T12:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.980186 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.980222 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.980235 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.980252 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:00 crc kubenswrapper[4851]: I1001 12:54:00.980264 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:00Z","lastTransitionTime":"2025-10-01T12:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.083307 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.083341 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.083351 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.083366 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.083378 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:01Z","lastTransitionTime":"2025-10-01T12:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.185904 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.185943 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.185955 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.185972 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.185984 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:01Z","lastTransitionTime":"2025-10-01T12:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.288549 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.288595 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.288606 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.288622 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.288635 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:01Z","lastTransitionTime":"2025-10-01T12:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.327579 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.327629 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.327614 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:01 crc kubenswrapper[4851]: E1001 12:54:01.327767 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:01 crc kubenswrapper[4851]: E1001 12:54:01.327839 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:01 crc kubenswrapper[4851]: E1001 12:54:01.327921 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.391693 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.391747 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.391765 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.391787 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.391805 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:01Z","lastTransitionTime":"2025-10-01T12:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.494037 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.494100 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.494124 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.494149 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.494166 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:01Z","lastTransitionTime":"2025-10-01T12:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.596558 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.596596 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.596610 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.596628 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.596641 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:01Z","lastTransitionTime":"2025-10-01T12:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.698993 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.699060 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.699084 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.699112 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.699133 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:01Z","lastTransitionTime":"2025-10-01T12:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.801922 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.801977 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.801996 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.802022 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.802041 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:01Z","lastTransitionTime":"2025-10-01T12:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.905486 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.905779 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.905900 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.906072 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:01 crc kubenswrapper[4851]: I1001 12:54:01.906218 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:01Z","lastTransitionTime":"2025-10-01T12:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.009085 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.009126 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.009137 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.009154 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.009165 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:02Z","lastTransitionTime":"2025-10-01T12:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
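What the repeated condition above is testing, in miniature: the runtime reports NetworkReady=false until a CNI network configuration appears in /etc/kubernetes/cni/net.d/, and the kubelet republishes Ready=False on every sync until then. Below is a minimal illustrative sketch of that directory check in Go, not the kubelet's actual implementation; the *.conf/*.conflist/*.json extensions are the usual CNI naming conventions, assumed here.

// cnicheck.go - illustrative only: reports whether any CNI network config
// exists in the directory named by the kubelet's error message.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig returns true if dir contains at least one file with a
// conventional CNI config extension.
func hasCNIConfig(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
	if err != nil {
		fmt.Println("cannot read CNI conf dir:", err)
		return
	}
	if !ok {
		// Mirrors the NetworkReady=false condition logged above.
		fmt.Println("NetworkReady=false: no CNI configuration file found")
		return
	}
	fmt.Println("NetworkReady=true")
}

On this node the check stays false because the network provider (multus/OVN) has not yet written its config, which is exactly what the "Has your network provider started?" hint points at.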
Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.079757 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.095084 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24be6f41-3569-45fc-ab82-805758988299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fafa3446313876d43dd3399cb4bcda6295ffc26df86ef9527278767327c1374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efbf0c7573d4d52655a656f8e7b194959cac7d6a16ce8ffe745d9afb60dccb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://647cbd9a63725397f8ffbd446cf836c3460abe9ddd35f3b467bb06f911b2c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.111716 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.111780 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.111799 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.111818 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.111829 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:02Z","lastTransitionTime":"2025-10-01T12:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.116720 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f77a9c5ff069b1de69400b48ecf50a0f01028d046323e1da2396606c318282\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.130613 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.144396 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.156166 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.168811 4851 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.181332 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.193204 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.202070 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.213762 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.213915 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.213993 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.214093 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.214167 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:02Z","lastTransitionTime":"2025-10-01T12:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.217164 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa
1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\"
,\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\
\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.226276 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-75dqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-75dqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.238786 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.251470 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.274138 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:53:45Z\\\",\\\"message\\\":\\\"735611 6267 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 12:53:45.735622 6267 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 12:53:45.735628 6267 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 12:53:45.735636 6267 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 12:53:45.735642 6267 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 12:53:45.735706 6267 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 12:53:45.736729 6267 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 12:53:45.736768 6267 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 12:53:45.736823 6267 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 12:53:45.736838 6267 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 12:53:45.736858 6267 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 12:53:45.736877 6267 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 12:53:45.736886 6267 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:53:45.736907 6267 factory.go:656] Stopping watch factory\\\\nI1001 12:53:45.736931 6267 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s 
restarting failed container=ovnkube-controller pod=ovnkube-node-s78wn_openshift-ovn-kubernetes(eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.286929 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/ru
n/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.299234 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.312202 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8889745c-17a6-44c0-be06-2f45a0f1316a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152301dc887033a9f46dded0a95a56edf7395806f427e60112739910a81dbc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c74d13ddaf4b7f8d5e56156f8bf3cbae697f483f9e6507bd673f1a0858a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7qmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 
12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.316972 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.317012 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.317024 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.317043 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.317058 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:02Z","lastTransitionTime":"2025-10-01T12:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.328412 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:02 crc kubenswrapper[4851]: E1001 12:54:02.328582 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.344379 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.358065 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8889745c-17a6-44c0-be06-2f45a0f1316a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152301dc887033a9f46dded0a95a56edf7395806f427e60112739910a81dbc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c74d13ddaf4b7f8d5e56156f8bf3cbae697f483f9e6507bd673f1a0858a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7qmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 
12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.372559 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24be6f41-3569-45fc-ab82-805758988299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fafa3446313876d43dd3399cb4bcda6295ffc26df86ef9527278767327c1374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efbf0c7573d4d52655a656f8e7b194959cac7d6a16ce8ffe745d9afb60dccb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://647cbd9a63725397f8ffbd446cf836c3460abe9ddd35f3b467bb06f911b2c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.387357 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f77a9c5ff069b1de69400b48ecf50a0f01028d046323e1da2396606c318282\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.403585 4851 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.418990 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.421215 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.421270 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.421292 4851 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.421322 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.421344 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:02Z","lastTransitionTime":"2025-10-01T12:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.443945 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.461453 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.475485 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.498790 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 
2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.515448 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.524106 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.524173 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.524196 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.524221 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.524238 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:02Z","lastTransitionTime":"2025-10-01T12:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.532731 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\
":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.550727 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-ku
be-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.569358 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.597232 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1
b5fd31e0d19dbcd614268eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:53:45Z\\\",\\\"message\\\":\\\"735611 6267 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 12:53:45.735622 6267 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 12:53:45.735628 6267 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 12:53:45.735636 6267 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 12:53:45.735642 6267 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 12:53:45.735706 6267 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 12:53:45.736729 6267 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 12:53:45.736768 6267 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 12:53:45.736823 6267 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 12:53:45.736838 6267 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 12:53:45.736858 6267 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 12:53:45.736877 6267 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 12:53:45.736886 6267 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:53:45.736907 6267 factory.go:656] Stopping watch factory\\\\nI1001 12:53:45.736931 6267 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-s78wn_openshift-ovn-kubernetes(eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.615660 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-
cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.628096 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.628202 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.628231 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.628314 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.628338 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:02Z","lastTransitionTime":"2025-10-01T12:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.632288 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-75dqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-75dqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.731439 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.731494 4851 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.731547 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.731572 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.731593 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:02Z","lastTransitionTime":"2025-10-01T12:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.835149 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.835201 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.835218 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.835240 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.835262 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:02Z","lastTransitionTime":"2025-10-01T12:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.937845 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.937893 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.937912 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.937935 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:02 crc kubenswrapper[4851]: I1001 12:54:02.937953 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:02Z","lastTransitionTime":"2025-10-01T12:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.040002 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.040057 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.040066 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.040080 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.040094 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:03Z","lastTransitionTime":"2025-10-01T12:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.142619 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.142652 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.142680 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.142696 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.142707 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:03Z","lastTransitionTime":"2025-10-01T12:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.245629 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.245731 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.245756 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.245787 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.245804 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:03Z","lastTransitionTime":"2025-10-01T12:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.328044 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.328079 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.328095 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:03 crc kubenswrapper[4851]: E1001 12:54:03.328225 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:03 crc kubenswrapper[4851]: E1001 12:54:03.328436 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:54:03 crc kubenswrapper[4851]: E1001 12:54:03.328578 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.349259 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.349323 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.349339 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.349363 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.349380 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:03Z","lastTransitionTime":"2025-10-01T12:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.452864 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.452943 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.452962 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.452988 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.453009 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:03Z","lastTransitionTime":"2025-10-01T12:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.556044 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.556142 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.556163 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.556187 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.556209 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:03Z","lastTransitionTime":"2025-10-01T12:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.660600 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.660651 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.660667 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.660690 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.660708 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:03Z","lastTransitionTime":"2025-10-01T12:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.764029 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.764091 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.764109 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.764134 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.764151 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:03Z","lastTransitionTime":"2025-10-01T12:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.867731 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.867792 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.867809 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.867834 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.867850 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:03Z","lastTransitionTime":"2025-10-01T12:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.970822 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.970952 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.970992 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.971027 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:03 crc kubenswrapper[4851]: I1001 12:54:03.971050 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:03Z","lastTransitionTime":"2025-10-01T12:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.074003 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.074058 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.074070 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.074089 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.074103 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:04Z","lastTransitionTime":"2025-10-01T12:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.177155 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.177204 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.177234 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.177254 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.177267 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:04Z","lastTransitionTime":"2025-10-01T12:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.179809 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.179944 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:04 crc kubenswrapper[4851]: E1001 12:54:04.179995 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 12:54:36.179973338 +0000 UTC m=+84.525090834 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.180071 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.180112 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:04 crc kubenswrapper[4851]: E1001 12:54:04.180126 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.180143 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:04 crc kubenswrapper[4851]: E1001 12:54:04.180156 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:54:04 crc kubenswrapper[4851]: E1001 12:54:04.180178 4851 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:54:04 crc kubenswrapper[4851]: E1001 12:54:04.180257 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:54:36.180236256 +0000 UTC m=+84.525353782 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:54:04 crc kubenswrapper[4851]: E1001 12:54:04.180257 4851 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:54:04 crc kubenswrapper[4851]: E1001 12:54:04.180297 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:54:04 crc kubenswrapper[4851]: E1001 12:54:04.180330 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:54:04 crc kubenswrapper[4851]: E1001 12:54:04.180347 4851 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:54:04 crc kubenswrapper[4851]: E1001 12:54:04.180314 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:54:36.180299738 +0000 UTC m=+84.525417254 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:54:04 crc kubenswrapper[4851]: E1001 12:54:04.180430 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:54:36.180408621 +0000 UTC m=+84.525526197 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:54:04 crc kubenswrapper[4851]: E1001 12:54:04.180461 4851 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:54:04 crc kubenswrapper[4851]: E1001 12:54:04.180638 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:54:36.180577666 +0000 UTC m=+84.525695192 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.280869 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.280944 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.280961 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.281021 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.281041 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:04Z","lastTransitionTime":"2025-10-01T12:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.327943 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:04 crc kubenswrapper[4851]: E1001 12:54:04.328207 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.383843 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.383922 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.383945 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.383977 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.384018 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:04Z","lastTransitionTime":"2025-10-01T12:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.487722 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.487796 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.487810 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.487831 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.487846 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:04Z","lastTransitionTime":"2025-10-01T12:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.584993 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs\") pod \"network-metrics-daemon-75dqp\" (UID: \"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\") " pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:04 crc kubenswrapper[4851]: E1001 12:54:04.585211 4851 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:54:04 crc kubenswrapper[4851]: E1001 12:54:04.585295 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs podName:8a8fe88f-cdfe-4415-98a0-4cc8f018a962 nodeName:}" failed. No retries permitted until 2025-10-01 12:54:20.585272707 +0000 UTC m=+68.930390223 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs") pod "network-metrics-daemon-75dqp" (UID: "8a8fe88f-cdfe-4415-98a0-4cc8f018a962") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.590993 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.591039 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.591058 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.591081 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:04 crc kubenswrapper[4851]: I1001 12:54:04.591100 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:04Z","lastTransitionTime":"2025-10-01T12:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
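The setters.go:603 record above is the kubelet writing the node's Ready condition; the same condition can be read back from the API server to confirm what the journal shows. A minimal sketch (not part of this log), in Python, assuming a working kubeconfig for the crc cluster; node_ready is our own helper name:

import json, subprocess

def node_ready(node: str = "crc") -> tuple[bool, str]:
    # Ask the API server for the node object; requires kubectl and a kubeconfig.
    out = subprocess.run(
        ["kubectl", "get", "node", node, "-o", "json"],
        capture_output=True, check=True, text=True,
    ).stdout
    for cond in json.loads(out)["status"]["conditions"]:
        if cond["type"] == "Ready":
            # status is "True"/"False"/"Unknown"; message mirrors the
            # KubeletNotReady text recorded in the journal above.
            return cond["status"] == "True", cond.get("message", "")
    return False, "no Ready condition reported"

if __name__ == "__main__":
    ready, msg = node_ready()
    print(f"Ready={ready}: {msg}")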
Oct 01 12:54:05 crc kubenswrapper[4851]: I1001 12:54:05.328103 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 12:54:05 crc kubenswrapper[4851]: I1001 12:54:05.328186 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 12:54:05 crc kubenswrapper[4851]: I1001 12:54:05.328231 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 12:54:05 crc kubenswrapper[4851]: E1001 12:54:05.328272 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 12:54:05 crc kubenswrapper[4851]: E1001 12:54:05.328379 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 12:54:05 crc kubenswrapper[4851]: E1001 12:54:05.328629 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
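All three sync failures above share one root cause (NetworkPluginNotReady); when triaging a journal like this it can help to list exactly which pods are blocked. A small Python sketch that scrapes the pod= and podUID= fields out of such records; the regex is tailored to the record shape shown here and is only an illustration:

import re

# Matches the kubelet's "Error syncing pod, skipping" records as they appear
# in this journal; pod="ns/name" and podUID="..." are quoted key=value fields.
PAT = re.compile(r'"Error syncing pod, skipping".*?pod="([^"]+)" podUID="([^"]+)"')

def blocked_pods(journal_text: str) -> set[tuple[str, str]]:
    return set(PAT.findall(journal_text))

# Example with one record from the excerpt above (err text abridged):
sample = ('E1001 12:54:05.328272 4851 pod_workers.go:1301] "Error syncing pod, '
          'skipping" err="network is not ready: ..." '
          'pod="openshift-network-diagnostics/network-check-target-xd92c" '
          'podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"')
print(blocked_pods(sample))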
Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.328341 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp"
Oct 01 12:54:06 crc kubenswrapper[4851]: E1001 12:54:06.328638 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962"
Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.330040 4851 scope.go:117] "RemoveContainer" containerID="aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb"
Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.660698 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.660733 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.660745 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.660761 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.660773 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:06Z","lastTransitionTime":"2025-10-01T12:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.720639 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s78wn_eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9/ovnkube-controller/1.log" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.725810 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerStarted","Data":"20e209631e3723f2f1d5027d67936d5ef6377bfb1efbbfe66acd73f77c950ca0"} Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.726321 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.746596 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\
\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:06Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.759914 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:06Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.766492 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.766562 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.766579 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.766603 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.766619 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:06Z","lastTransitionTime":"2025-10-01T12:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.776993 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:06Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.794732 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:06Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.816677 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:06Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.830851 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:06Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.856240 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:06Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.869420 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.869459 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.869470 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.869486 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.869514 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:06Z","lastTransitionTime":"2025-10-01T12:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.870342 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-75dqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-75dqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:06Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.888962 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:06Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.924183 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:06Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.942964 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e209631e3723f2f1d5027d67936d5ef6377bfb1efbbfe66acd73f77c950ca0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:53:45Z\\\",\\\"message\\\":\\\"735611 6267 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 12:53:45.735622 6267 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 12:53:45.735628 6267 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 12:53:45.735636 6267 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 12:53:45.735642 6267 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 12:53:45.735706 6267 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 12:53:45.736729 6267 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 12:53:45.736768 6267 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 12:53:45.736823 6267 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 12:53:45.736838 6267 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 12:53:45.736858 6267 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 12:53:45.736877 6267 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 12:53:45.736886 6267 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:53:45.736907 6267 factory.go:656] Stopping watch factory\\\\nI1001 12:53:45.736931 6267 ovnkube.go:599] Stopped 
ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:06Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.953129 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:06Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.966220 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8889745c-17a6-44c0-be06-2f45a0f1316a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152301dc887033a9f46dded0a95a56edf7395806f427e60112739910a81dbc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c74d13ddaf4b7f8d5e56156f8bf3cbae697f483f9e6507bd673f1a0858a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"
192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7qmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:06Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.972007 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.972040 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.972053 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.972071 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.972083 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:06Z","lastTransitionTime":"2025-10-01T12:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.981737 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24be6f41-3569-45fc-ab82-805758988299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fafa3446313876d43dd3399cb4bcda6295ffc26df86ef9527278767327c1374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efbf0c7573d4d52655a656f8e7b194959cac7d6a16ce8ffe745d9afb60dccb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://647cbd9a63725397f8ffbd446cf836c3460abe9ddd35f3b467bb06f911b2c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:06Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:06 crc kubenswrapper[4851]: I1001 12:54:06.994344 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operato
r@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f77a9c5ff069b1de69400b48ecf50a0f01028d046323e1da2396606c318282\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:06Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.005759 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:07Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.017450 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:07Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.074523 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.074567 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.074579 4851 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.074595 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.074608 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:07Z","lastTransitionTime":"2025-10-01T12:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.177157 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.177218 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.177237 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.177261 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.177280 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:07Z","lastTransitionTime":"2025-10-01T12:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.280212 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.280314 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.280337 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.280363 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.280385 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:07Z","lastTransitionTime":"2025-10-01T12:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.328293 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:07 crc kubenswrapper[4851]: E1001 12:54:07.328562 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.328681 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.328705 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:07 crc kubenswrapper[4851]: E1001 12:54:07.328825 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:54:07 crc kubenswrapper[4851]: E1001 12:54:07.328941 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.383708 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.383750 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.383762 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.383780 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.383796 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:07Z","lastTransitionTime":"2025-10-01T12:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.486264 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.486309 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.486319 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.486334 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.486344 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:07Z","lastTransitionTime":"2025-10-01T12:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.588801 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.588877 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.588906 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.588940 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.588964 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:07Z","lastTransitionTime":"2025-10-01T12:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.691473 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.691577 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.691602 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.691630 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.691651 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:07Z","lastTransitionTime":"2025-10-01T12:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.730517 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s78wn_eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9/ovnkube-controller/2.log" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.731384 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s78wn_eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9/ovnkube-controller/1.log" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.734408 4851 generic.go:334] "Generic (PLEG): container finished" podID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerID="20e209631e3723f2f1d5027d67936d5ef6377bfb1efbbfe66acd73f77c950ca0" exitCode=1 Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.734488 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerDied","Data":"20e209631e3723f2f1d5027d67936d5ef6377bfb1efbbfe66acd73f77c950ca0"} Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.734613 4851 scope.go:117] "RemoveContainer" containerID="aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.735468 4851 scope.go:117] "RemoveContainer" containerID="20e209631e3723f2f1d5027d67936d5ef6377bfb1efbbfe66acd73f77c950ca0" Oct 01 12:54:07 crc kubenswrapper[4851]: E1001 12:54:07.735747 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-s78wn_openshift-ovn-kubernetes(eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.751145 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:07Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.772018 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:07Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.794445 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.794478 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.794489 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.794528 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.794541 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:07Z","lastTransitionTime":"2025-10-01T12:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.802268 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e209631e3723f2f1d5027d67936d5ef6377bfb
1efbbfe66acd73f77c950ca0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef56ebf7dfa3681f1d26d6d8f61d0932a8d61a1b5fd31e0d19dbcd614268eeb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:53:45Z\\\",\\\"message\\\":\\\"735611 6267 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 12:53:45.735622 6267 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 12:53:45.735628 6267 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 12:53:45.735636 6267 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 12:53:45.735642 6267 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 12:53:45.735706 6267 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 12:53:45.736729 6267 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 12:53:45.736768 6267 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 12:53:45.736823 6267 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 12:53:45.736838 6267 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 12:53:45.736858 6267 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 12:53:45.736877 6267 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 12:53:45.736886 6267 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:53:45.736907 6267 factory.go:656] Stopping watch factory\\\\nI1001 12:53:45.736931 6267 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20e209631e3723f2f1d5027d67936d5ef6377bfb1efbbfe66acd73f77c950ca0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:54:07Z\\\",\\\"message\\\":\\\"ics-daemon-75dqp openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx openshift-ovn-kubernetes/ovnkube-node-s78wn openshift-dns/node-resolver-5q84l openshift-machine-config-operator/machine-config-daemon-fv72m openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-node-identity/network-node-identity-vrzqb openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-multus/multus-additional-cni-plugins-brwh5 openshift-network-operator/iptables-alerter-4ln5h openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-kube-apiserver/kube-apiserver-crc openshift-network-diagnostics/network-check-target-xd92c]\\\\nI1001 12:54:07.287188 6572 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1001 12:54:07.287193 6572 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start 
node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed cal\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74
cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:07Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.817599 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sy
stem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:07Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.832307 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-75dqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-75dqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:07Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.848989 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:07Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.862406 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8889745c-17a6-44c0-be06-2f45a0f1316a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152301dc887033a9f46dded0a95a56edf7395806f427e60112739910a81dbc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c74d13ddaf4b7f8d5e56156f8bf3cbae697f483f9e6507bd673f1a0858a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7qmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:07Z is after 2025-08-24T17:21:41Z" Oct 01 
12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.879360 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24be6f41-3569-45fc-ab82-805758988299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fafa3446313876d43dd3399cb4bcda6295ffc26df86ef9527278767327c1374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efbf0c7573d4d52655a656f8e7b194959cac7d6a16ce8ffe745d9afb60dccb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://647cbd9a63725397f8ffbd446cf836c3460abe9ddd35f3b467bb06f911b2c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:07Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.898074 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.898146 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.898170 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.898201 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.898243 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:07Z","lastTransitionTime":"2025-10-01T12:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.901343 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f77a9c5ff069b1de69400b48ecf50a0f01028d046323e1da2396606c318282\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:07Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.918253 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:07Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.933050 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:07Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.945052 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:07Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.961369 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:07Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.975428 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:07Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:07 crc kubenswrapper[4851]: I1001 12:54:07.987181 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:07Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.002762 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.002851 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.002863 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.002880 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.002890 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:08Z","lastTransitionTime":"2025-10-01T12:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.006390 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa
1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\"
,\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\
\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.021402 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e9
5ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.106301 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.106370 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.106393 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.106423 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.106445 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:08Z","lastTransitionTime":"2025-10-01T12:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.209320 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.209381 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.209398 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.209423 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.209441 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:08Z","lastTransitionTime":"2025-10-01T12:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.312593 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.312662 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.312684 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.312712 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.312732 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:08Z","lastTransitionTime":"2025-10-01T12:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.328033 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:08 crc kubenswrapper[4851]: E1001 12:54:08.328268 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.415913 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.415974 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.415992 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.416015 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.416032 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:08Z","lastTransitionTime":"2025-10-01T12:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.518866 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.518918 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.518935 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.518957 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.518974 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:08Z","lastTransitionTime":"2025-10-01T12:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.622000 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.622048 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.622068 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.622091 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.622108 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:08Z","lastTransitionTime":"2025-10-01T12:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.724960 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.725058 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.725076 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.725101 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.725120 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:08Z","lastTransitionTime":"2025-10-01T12:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.741693 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s78wn_eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9/ovnkube-controller/2.log" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.746960 4851 scope.go:117] "RemoveContainer" containerID="20e209631e3723f2f1d5027d67936d5ef6377bfb1efbbfe66acd73f77c950ca0" Oct 01 12:54:08 crc kubenswrapper[4851]: E1001 12:54:08.747227 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-s78wn_openshift-ovn-kubernetes(eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.765961 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24be6f41-3569-45fc-ab82-805758988299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fafa3446313876d43dd3399cb4bcda6295ffc26df86ef9527278767327c1374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efbf0c7573d4d52655a656f8e7b194959cac7d6a16ce8ffe745d9afb60dccb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://647cbd9a63725397f8ffbd446cf836c3460abe9ddd35f3b467bb06f911b2c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.791454 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f77a9c5ff069b1de69400b48ecf50a0f01028d046323e1da2396606c318282\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.810767 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.829617 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.829687 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.829707 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.829733 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.829750 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:08Z","lastTransitionTime":"2025-10-01T12:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.831105 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.845197 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.862874 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.881491 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.896273 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.920211 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:08Z is after 
2025-08-24T17:21:41Z" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.934088 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.934136 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.934156 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.934181 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.934198 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:08Z","lastTransitionTime":"2025-10-01T12:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.938342 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e
95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.958479 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:08 crc kubenswrapper[4851]: I1001 12:54:08.978973 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.010625 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e209631e3723f2f1d5027d67936d5ef6377bfb
1efbbfe66acd73f77c950ca0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20e209631e3723f2f1d5027d67936d5ef6377bfb1efbbfe66acd73f77c950ca0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:54:07Z\\\",\\\"message\\\":\\\"ics-daemon-75dqp openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx openshift-ovn-kubernetes/ovnkube-node-s78wn openshift-dns/node-resolver-5q84l openshift-machine-config-operator/machine-config-daemon-fv72m openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-node-identity/network-node-identity-vrzqb openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-multus/multus-additional-cni-plugins-brwh5 openshift-network-operator/iptables-alerter-4ln5h openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-kube-apiserver/kube-apiserver-crc openshift-network-diagnostics/network-check-target-xd92c]\\\\nI1001 12:54:07.287188 6572 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1001 12:54:07.287193 6572 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed cal\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:54:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-s78wn_openshift-ovn-kubernetes(eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.029571 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-
cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.036793 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.036846 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.036870 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.036901 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.036957 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:09Z","lastTransitionTime":"2025-10-01T12:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.046021 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-75dqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-75dqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.063148 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.084537 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8889745c-17a6-44c0-be06-2f45a0f1316a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152301dc887033a9f46dded0a95a56edf7395806f427e60112739910a81dbc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c74d13ddaf4b7f8d5e56156f8bf3cbae697f483f9e6507bd673f1a0858a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7qmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:09Z is after 2025-08-24T17:21:41Z" Oct 01 
12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.139549 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.139628 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.139649 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.139675 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.139693 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:09Z","lastTransitionTime":"2025-10-01T12:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.242455 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.242564 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.242583 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.242609 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.242628 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:09Z","lastTransitionTime":"2025-10-01T12:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.263069 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.263188 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.263211 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.263233 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.263250 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:09Z","lastTransitionTime":"2025-10-01T12:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:09 crc kubenswrapper[4851]: E1001 12:54:09.281387 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.286165 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.286206 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.286221 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.286244 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.286277 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:09Z","lastTransitionTime":"2025-10-01T12:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:09 crc kubenswrapper[4851]: E1001 12:54:09.306618 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.311589 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.311656 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.311673 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.311702 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.311720 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:09Z","lastTransitionTime":"2025-10-01T12:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.327675 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.327711 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.327758 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:09 crc kubenswrapper[4851]: E1001 12:54:09.327841 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:09 crc kubenswrapper[4851]: E1001 12:54:09.328014 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:09 crc kubenswrapper[4851]: E1001 12:54:09.328137 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:54:09 crc kubenswrapper[4851]: E1001 12:54:09.333621 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.341890 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.341954 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.341973 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.341998 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.342017 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:09Z","lastTransitionTime":"2025-10-01T12:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:09 crc kubenswrapper[4851]: E1001 12:54:09.364371 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.369429 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.369529 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.369549 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.369575 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.369595 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:09Z","lastTransitionTime":"2025-10-01T12:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:09 crc kubenswrapper[4851]: E1001 12:54:09.388072 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:09 crc kubenswrapper[4851]: E1001 12:54:09.388294 4851 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.390276 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.390315 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.390329 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.390348 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.390363 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:09Z","lastTransitionTime":"2025-10-01T12:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.493934 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.493993 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.494012 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.494038 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.494057 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:09Z","lastTransitionTime":"2025-10-01T12:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.596198 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.596252 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.596269 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.596291 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.596309 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:09Z","lastTransitionTime":"2025-10-01T12:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.699689 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.699760 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.699781 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.699806 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.699823 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:09Z","lastTransitionTime":"2025-10-01T12:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.802089 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.802174 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.802195 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.802218 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.802234 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:09Z","lastTransitionTime":"2025-10-01T12:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.905226 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.905307 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.905331 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.905361 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:09 crc kubenswrapper[4851]: I1001 12:54:09.905384 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:09Z","lastTransitionTime":"2025-10-01T12:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.008841 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.008918 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.008946 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.008974 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.008995 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:10Z","lastTransitionTime":"2025-10-01T12:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.111705 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.111768 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.111786 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.111813 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.111832 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:10Z","lastTransitionTime":"2025-10-01T12:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.215461 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.215590 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.215619 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.215645 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.215664 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:10Z","lastTransitionTime":"2025-10-01T12:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.318859 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.319337 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.319485 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.319671 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.319811 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:10Z","lastTransitionTime":"2025-10-01T12:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.327824 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:10 crc kubenswrapper[4851]: E1001 12:54:10.328215 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.423600 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.424452 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.424603 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.424704 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.424798 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:10Z","lastTransitionTime":"2025-10-01T12:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.527618 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.527709 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.527722 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.527743 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.527755 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:10Z","lastTransitionTime":"2025-10-01T12:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.630713 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.630768 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.630785 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.630978 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.631005 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:10Z","lastTransitionTime":"2025-10-01T12:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.733591 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.733656 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.733674 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.733697 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.733716 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:10Z","lastTransitionTime":"2025-10-01T12:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.836744 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.836798 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.836816 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.836840 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.836858 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:10Z","lastTransitionTime":"2025-10-01T12:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.940333 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.940689 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.940896 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.941096 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:10 crc kubenswrapper[4851]: I1001 12:54:10.941243 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:10Z","lastTransitionTime":"2025-10-01T12:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.044842 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.044899 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.044919 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.044943 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.044962 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:11Z","lastTransitionTime":"2025-10-01T12:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.148444 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.148571 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.148598 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.148632 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.148655 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:11Z","lastTransitionTime":"2025-10-01T12:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.252736 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.252794 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.252811 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.252835 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.252851 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:11Z","lastTransitionTime":"2025-10-01T12:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.328303 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:11 crc kubenswrapper[4851]: E1001 12:54:11.328468 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.328583 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:11 crc kubenswrapper[4851]: E1001 12:54:11.328665 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.328731 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:11 crc kubenswrapper[4851]: E1001 12:54:11.328820 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.355703 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.355746 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.355763 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.355785 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.355801 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:11Z","lastTransitionTime":"2025-10-01T12:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.459124 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.459181 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.459198 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.459224 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.459246 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:11Z","lastTransitionTime":"2025-10-01T12:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.563103 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.563157 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.563174 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.563197 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.563214 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:11Z","lastTransitionTime":"2025-10-01T12:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.709191 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.709223 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.709233 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.709250 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.709262 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:11Z","lastTransitionTime":"2025-10-01T12:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.813226 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.813290 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.813309 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.813332 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.813350 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:11Z","lastTransitionTime":"2025-10-01T12:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.916387 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.916440 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.916457 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.916478 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:11 crc kubenswrapper[4851]: I1001 12:54:11.916495 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:11Z","lastTransitionTime":"2025-10-01T12:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.020018 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.020110 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.020135 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.020164 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.020257 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:12Z","lastTransitionTime":"2025-10-01T12:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.122969 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.123032 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.123049 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.123075 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.123092 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:12Z","lastTransitionTime":"2025-10-01T12:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.225725 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.225820 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.225844 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.225873 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.225895 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:12Z","lastTransitionTime":"2025-10-01T12:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.327542 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:12 crc kubenswrapper[4851]: E1001 12:54:12.327782 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.328454 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.328495 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.328543 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.328566 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.328583 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:12Z","lastTransitionTime":"2025-10-01T12:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.346968 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.363874 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.382973 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.404320 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.421855 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.433038 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.433085 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.433104 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.433126 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.433143 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:12Z","lastTransitionTime":"2025-10-01T12:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.449685 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa
1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\"
,\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\
\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.467144 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-75dqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-75dqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.487260 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.504158 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.527098 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e209631e3723f2f1d5027d67936d5ef6377bfb1efbbfe66acd73f77c950ca0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20e209631e3723f2f1d5027d67936d5ef6377bfb1efbbfe66acd73f77c950ca0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:54:07Z\\\",\\\"message\\\":\\\"ics-daemon-75dqp openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx openshift-ovn-kubernetes/ovnkube-node-s78wn openshift-dns/node-resolver-5q84l openshift-machine-config-operator/machine-config-daemon-fv72m openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-node-identity/network-node-identity-vrzqb openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-multus/multus-additional-cni-plugins-brwh5 openshift-network-operator/iptables-alerter-4ln5h openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-kube-apiserver/kube-apiserver-crc openshift-network-diagnostics/network-check-target-xd92c]\\\\nI1001 12:54:07.287188 6572 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1001 12:54:07.287193 6572 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed cal\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:54:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting 
failed container=ovnkube-controller pod=ovnkube-node-s78wn_openshift-ovn-kubernetes(eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.535939 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.535981 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.535993 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.536012 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.536026 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:12Z","lastTransitionTime":"2025-10-01T12:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.543563 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.557920 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.570341 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8889745c-17a6-44c0-be06-2f45a0f1316a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152301dc887033a9f46dded0a95a56edf7395806f427e60112739910a81dbc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c74d13ddaf4b7f8d5e56156f8bf3cbae697f483f9e6507bd673f1a0858a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7qmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:12Z is after 2025-08-24T17:21:41Z" Oct 01 
12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.589698 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24be6f41-3569-45fc-ab82-805758988299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fafa3446313876d43dd3399cb4bcda6295ffc26df86ef9527278767327c1374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efbf0c7573d4d52655a656f8e7b194959cac7d6a16ce8ffe745d9afb60dccb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://647cbd9a63725397f8ffbd446cf836c3460abe9ddd35f3b467bb06f911b2c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.610377 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f77a9c5ff069b1de69400b48ecf50a0f01028d046323e1da2396606c318282\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.627052 4851 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.638695 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.638735 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.638744 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.638760 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.638773 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:12Z","lastTransitionTime":"2025-10-01T12:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.643136 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.742700 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 
12:54:12.742765 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.742786 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.742816 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.742837 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:12Z","lastTransitionTime":"2025-10-01T12:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.846168 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.846256 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.846279 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.846310 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.846337 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:12Z","lastTransitionTime":"2025-10-01T12:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.949396 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.949483 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.949543 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.949574 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:12 crc kubenswrapper[4851]: I1001 12:54:12.949596 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:12Z","lastTransitionTime":"2025-10-01T12:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.052860 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.052926 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.052951 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.052980 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.053002 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:13Z","lastTransitionTime":"2025-10-01T12:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.156032 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.156077 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.156089 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.156107 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.156118 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:13Z","lastTransitionTime":"2025-10-01T12:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.259152 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.259191 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.259203 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.259219 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.259231 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:13Z","lastTransitionTime":"2025-10-01T12:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.327815 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:13 crc kubenswrapper[4851]: E1001 12:54:13.327936 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.328121 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:13 crc kubenswrapper[4851]: E1001 12:54:13.328191 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.328329 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:13 crc kubenswrapper[4851]: E1001 12:54:13.328391 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.361816 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.361879 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.361902 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.361929 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.361981 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:13Z","lastTransitionTime":"2025-10-01T12:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.470157 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.470219 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.470242 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.470287 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.470312 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:13Z","lastTransitionTime":"2025-10-01T12:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.573611 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.573717 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.573743 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.573818 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.573841 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:13Z","lastTransitionTime":"2025-10-01T12:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.678883 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.678976 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.679001 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.679076 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.679104 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:13Z","lastTransitionTime":"2025-10-01T12:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.782078 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.782124 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.782133 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.782146 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.782155 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:13Z","lastTransitionTime":"2025-10-01T12:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.885603 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.885656 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.885672 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.885696 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.885715 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:13Z","lastTransitionTime":"2025-10-01T12:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.989057 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.989113 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.989132 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.989155 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:13 crc kubenswrapper[4851]: I1001 12:54:13.989173 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:13Z","lastTransitionTime":"2025-10-01T12:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.092901 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.093002 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.093028 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.093061 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.093084 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:14Z","lastTransitionTime":"2025-10-01T12:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.195850 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.195900 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.195915 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.195938 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.195956 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:14Z","lastTransitionTime":"2025-10-01T12:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.298411 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.298469 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.298487 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.298542 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.298568 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:14Z","lastTransitionTime":"2025-10-01T12:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.327726 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:14 crc kubenswrapper[4851]: E1001 12:54:14.327980 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.402144 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.402199 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.402217 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.402241 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.402258 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:14Z","lastTransitionTime":"2025-10-01T12:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.505191 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.505251 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.505268 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.505294 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.505311 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:14Z","lastTransitionTime":"2025-10-01T12:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.608767 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.608827 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.608850 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.608881 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.608902 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:14Z","lastTransitionTime":"2025-10-01T12:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.711979 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.712035 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.712051 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.712076 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.712094 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:14Z","lastTransitionTime":"2025-10-01T12:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.815667 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.815702 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.815710 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.815740 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.815750 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:14Z","lastTransitionTime":"2025-10-01T12:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.919241 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.919299 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.919318 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.919343 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:14 crc kubenswrapper[4851]: I1001 12:54:14.919360 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:14Z","lastTransitionTime":"2025-10-01T12:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.022572 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.022634 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.022652 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.022679 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.022697 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:15Z","lastTransitionTime":"2025-10-01T12:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.125545 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.125604 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.125620 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.125644 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.125662 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:15Z","lastTransitionTime":"2025-10-01T12:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.229074 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.229128 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.229145 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.229173 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.229190 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:15Z","lastTransitionTime":"2025-10-01T12:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.327869 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.327956 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 12:54:15 crc kubenswrapper[4851]: E1001 12:54:15.328088 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.328096 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 12:54:15 crc kubenswrapper[4851]: E1001 12:54:15.328249 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 12:54:15 crc kubenswrapper[4851]: E1001 12:54:15.328493 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.331912 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.331948 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.331961 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.331979 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.331993 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:15Z","lastTransitionTime":"2025-10-01T12:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.434684 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.434726 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.434737 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.434753 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.434764 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:15Z","lastTransitionTime":"2025-10-01T12:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.537811 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.537903 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.537921 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.537945 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.537963 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:15Z","lastTransitionTime":"2025-10-01T12:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.641247 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.641344 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.641372 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.641404 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.641454 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:15Z","lastTransitionTime":"2025-10-01T12:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.743878 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.743921 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.743939 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.743960 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.743976 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:15Z","lastTransitionTime":"2025-10-01T12:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.846517 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.846551 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.846560 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.846574 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.846583 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:15Z","lastTransitionTime":"2025-10-01T12:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.949086 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.949139 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.949151 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.949169 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:15 crc kubenswrapper[4851]: I1001 12:54:15.949181 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:15Z","lastTransitionTime":"2025-10-01T12:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.051125 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.051168 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.051182 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.051204 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.051222 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:16Z","lastTransitionTime":"2025-10-01T12:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.153522 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.153554 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.153563 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.153576 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.153585 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:16Z","lastTransitionTime":"2025-10-01T12:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.255780 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.255829 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.255841 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.255858 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.255869 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:16Z","lastTransitionTime":"2025-10-01T12:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.327763 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp"
Oct 01 12:54:16 crc kubenswrapper[4851]: E1001 12:54:16.327901 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.357346 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.357426 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.357438 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.357453 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.357466 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:16Z","lastTransitionTime":"2025-10-01T12:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.462769 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.462801 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.462817 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.463473 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.463506 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:16Z","lastTransitionTime":"2025-10-01T12:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.566175 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.566201 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.566210 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.566222 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.566231 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:16Z","lastTransitionTime":"2025-10-01T12:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.668764 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.668790 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.668798 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.668809 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.668818 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:16Z","lastTransitionTime":"2025-10-01T12:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.771004 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.771048 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.771059 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.771073 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.771085 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:16Z","lastTransitionTime":"2025-10-01T12:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.874167 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.874209 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.874221 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.874236 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.874246 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:16Z","lastTransitionTime":"2025-10-01T12:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.977251 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.977314 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.977332 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.977357 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:16 crc kubenswrapper[4851]: I1001 12:54:16.977375 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:16Z","lastTransitionTime":"2025-10-01T12:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.079927 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.079993 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.080012 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.080036 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.080052 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:17Z","lastTransitionTime":"2025-10-01T12:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.182935 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.183011 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.183064 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.183097 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.183248 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:17Z","lastTransitionTime":"2025-10-01T12:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.285965 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.286018 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.286035 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.286059 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.286075 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:17Z","lastTransitionTime":"2025-10-01T12:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.327483 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.327584 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.327631 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 12:54:17 crc kubenswrapper[4851]: E1001 12:54:17.327786 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 12:54:17 crc kubenswrapper[4851]: E1001 12:54:17.327859 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 12:54:17 crc kubenswrapper[4851]: E1001 12:54:17.328009 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.388934 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.388994 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.389012 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.389035 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.389053 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:17Z","lastTransitionTime":"2025-10-01T12:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.491555 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.491593 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.491602 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.491615 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.491624 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:17Z","lastTransitionTime":"2025-10-01T12:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.594040 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.594084 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.594096 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.594111 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.594120 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:17Z","lastTransitionTime":"2025-10-01T12:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.696696 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.696724 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.696732 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.696747 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.696755 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:17Z","lastTransitionTime":"2025-10-01T12:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.798459 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.798539 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.798556 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.798583 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.798602 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:17Z","lastTransitionTime":"2025-10-01T12:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.901084 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.901170 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.901188 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.901215 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:17 crc kubenswrapper[4851]: I1001 12:54:17.901234 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:17Z","lastTransitionTime":"2025-10-01T12:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.003786 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.003827 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.003839 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.003855 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.003866 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:18Z","lastTransitionTime":"2025-10-01T12:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.105892 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.105933 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.105942 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.105957 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.105969 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:18Z","lastTransitionTime":"2025-10-01T12:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.207966 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.208003 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.208012 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.208026 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.208039 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:18Z","lastTransitionTime":"2025-10-01T12:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.311051 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.311110 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.311128 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.311153 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.311169 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:18Z","lastTransitionTime":"2025-10-01T12:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.328366 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp"
Oct 01 12:54:18 crc kubenswrapper[4851]: E1001 12:54:18.328630 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.413692 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.413760 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.413788 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.413815 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.413838 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:18Z","lastTransitionTime":"2025-10-01T12:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.516095 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.516147 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.516164 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.516185 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.516203 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:18Z","lastTransitionTime":"2025-10-01T12:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.618533 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.618581 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.618599 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.618621 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.618637 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:18Z","lastTransitionTime":"2025-10-01T12:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.721608 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.721665 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.721684 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.721707 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.721725 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:18Z","lastTransitionTime":"2025-10-01T12:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.823394 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.823653 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.823727 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.823822 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.823906 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:18Z","lastTransitionTime":"2025-10-01T12:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.926636 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.926676 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.926685 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.926699 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:18 crc kubenswrapper[4851]: I1001 12:54:18.926709 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:18Z","lastTransitionTime":"2025-10-01T12:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.029685 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.029768 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.029791 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.029815 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.029835 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:19Z","lastTransitionTime":"2025-10-01T12:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.131453 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.131522 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.131572 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.131596 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.131614 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:19Z","lastTransitionTime":"2025-10-01T12:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.234432 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.234483 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.234492 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.234529 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.234539 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:19Z","lastTransitionTime":"2025-10-01T12:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.327919 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 12:54:19 crc kubenswrapper[4851]: E1001 12:54:19.328231 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.327931 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.328656 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 12:54:19 crc kubenswrapper[4851]: E1001 12:54:19.328805 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 12:54:19 crc kubenswrapper[4851]: E1001 12:54:19.329037 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.329196 4851 scope.go:117] "RemoveContainer" containerID="20e209631e3723f2f1d5027d67936d5ef6377bfb1efbbfe66acd73f77c950ca0"
Oct 01 12:54:19 crc kubenswrapper[4851]: E1001 12:54:19.329529 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-s78wn_openshift-ovn-kubernetes(eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.336814 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.336949 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.337015 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.337080 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.337135 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:19Z","lastTransitionTime":"2025-10-01T12:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.439932 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.439955 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.439964 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.439978 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.439986 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:19Z","lastTransitionTime":"2025-10-01T12:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.541811 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.541864 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.541881 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.541906 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.541923 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:19Z","lastTransitionTime":"2025-10-01T12:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.644012 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.644066 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.644083 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.644106 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.644123 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:19Z","lastTransitionTime":"2025-10-01T12:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.728961 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.728996 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.729004 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.729017 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.729025 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:19Z","lastTransitionTime":"2025-10-01T12:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:19 crc kubenswrapper[4851]: E1001 12:54:19.747593 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.751132 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.751197 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.751216 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.751242 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.751262 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:19Z","lastTransitionTime":"2025-10-01T12:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:19 crc kubenswrapper[4851]: E1001 12:54:19.772540 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.775897 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.775935 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.775949 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.775964 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.775974 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:19Z","lastTransitionTime":"2025-10-01T12:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:19 crc kubenswrapper[4851]: E1001 12:54:19.786445 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.789626 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.789660 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.789673 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.789686 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.789695 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:19Z","lastTransitionTime":"2025-10-01T12:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:19 crc kubenswrapper[4851]: E1001 12:54:19.799755 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.804865 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.804966 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.805032 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.805096 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.805159 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:19Z","lastTransitionTime":"2025-10-01T12:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:19 crc kubenswrapper[4851]: E1001 12:54:19.818845 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:19 crc kubenswrapper[4851]: E1001 12:54:19.819100 4851 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.820633 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.820660 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.820671 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.820688 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.820699 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:19Z","lastTransitionTime":"2025-10-01T12:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.923447 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.923619 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.923705 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.923774 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:19 crc kubenswrapper[4851]: I1001 12:54:19.923834 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:19Z","lastTransitionTime":"2025-10-01T12:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.025560 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.025592 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.025618 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.025633 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.025644 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:20Z","lastTransitionTime":"2025-10-01T12:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.127639 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.127679 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.127691 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.127705 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.127716 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:20Z","lastTransitionTime":"2025-10-01T12:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.230737 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.230792 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.230815 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.230841 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.230861 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:20Z","lastTransitionTime":"2025-10-01T12:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.327853 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:20 crc kubenswrapper[4851]: E1001 12:54:20.328224 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.335656 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.335698 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.335711 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.335729 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.335743 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:20Z","lastTransitionTime":"2025-10-01T12:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.344775 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.439472 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.439888 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.440029 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.440258 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.440473 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:20Z","lastTransitionTime":"2025-10-01T12:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.543306 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.543358 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.543377 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.543401 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.543417 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:20Z","lastTransitionTime":"2025-10-01T12:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:20 crc kubenswrapper[4851]: E1001 12:54:20.606549 4851 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:54:20 crc kubenswrapper[4851]: E1001 12:54:20.606634 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs podName:8a8fe88f-cdfe-4415-98a0-4cc8f018a962 nodeName:}" failed. No retries permitted until 2025-10-01 12:54:52.606615657 +0000 UTC m=+100.951733153 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs") pod "network-metrics-daemon-75dqp" (UID: "8a8fe88f-cdfe-4415-98a0-4cc8f018a962") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.607117 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs\") pod \"network-metrics-daemon-75dqp\" (UID: \"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\") " pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.645397 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.645424 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.645433 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.645446 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.645455 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:20Z","lastTransitionTime":"2025-10-01T12:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.748071 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.748125 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.748142 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.748166 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.748183 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:20Z","lastTransitionTime":"2025-10-01T12:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.850922 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.850974 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.850994 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.851018 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.851034 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:20Z","lastTransitionTime":"2025-10-01T12:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.953294 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.953325 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.953334 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.953347 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:20 crc kubenswrapper[4851]: I1001 12:54:20.953357 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:20Z","lastTransitionTime":"2025-10-01T12:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.055947 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.055994 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.056010 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.056030 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.056048 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:21Z","lastTransitionTime":"2025-10-01T12:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.158802 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.158897 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.158910 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.158929 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.158939 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:21Z","lastTransitionTime":"2025-10-01T12:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.261668 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.261724 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.261741 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.261764 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.261780 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:21Z","lastTransitionTime":"2025-10-01T12:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.327664 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.327690 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.327797 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:21 crc kubenswrapper[4851]: E1001 12:54:21.327981 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:21 crc kubenswrapper[4851]: E1001 12:54:21.328096 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:21 crc kubenswrapper[4851]: E1001 12:54:21.328277 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.364491 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.364575 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.364593 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.364619 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.364637 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:21Z","lastTransitionTime":"2025-10-01T12:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.467242 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.467325 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.467350 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.467386 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.467411 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:21Z","lastTransitionTime":"2025-10-01T12:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.569718 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.569794 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.569819 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.569851 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.569875 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:21Z","lastTransitionTime":"2025-10-01T12:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.671451 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.671489 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.671520 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.671537 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.671548 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:21Z","lastTransitionTime":"2025-10-01T12:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.774747 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.775130 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.775152 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.775181 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.775203 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:21Z","lastTransitionTime":"2025-10-01T12:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.877539 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.877592 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.877619 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.877643 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.877661 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:21Z","lastTransitionTime":"2025-10-01T12:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.980998 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.981068 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.981085 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.981110 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:21 crc kubenswrapper[4851]: I1001 12:54:21.981127 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:21Z","lastTransitionTime":"2025-10-01T12:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.083650 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.083699 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.083716 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.083739 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.083759 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:22Z","lastTransitionTime":"2025-10-01T12:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.185879 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.185963 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.185988 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.186020 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.186042 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:22Z","lastTransitionTime":"2025-10-01T12:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.288065 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.288116 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.288130 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.288151 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.288168 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:22Z","lastTransitionTime":"2025-10-01T12:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.328013 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:22 crc kubenswrapper[4851]: E1001 12:54:22.328204 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.343432 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-re
sources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.360565 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.378570 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e209631e3723f2f1d5027d67936d5ef6377bfb
1efbbfe66acd73f77c950ca0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20e209631e3723f2f1d5027d67936d5ef6377bfb1efbbfe66acd73f77c950ca0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:54:07Z\\\",\\\"message\\\":\\\"ics-daemon-75dqp openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx openshift-ovn-kubernetes/ovnkube-node-s78wn openshift-dns/node-resolver-5q84l openshift-machine-config-operator/machine-config-daemon-fv72m openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-node-identity/network-node-identity-vrzqb openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-multus/multus-additional-cni-plugins-brwh5 openshift-network-operator/iptables-alerter-4ln5h openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-kube-apiserver/kube-apiserver-crc openshift-network-diagnostics/network-check-target-xd92c]\\\\nI1001 12:54:07.287188 6572 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1001 12:54:07.287193 6572 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed cal\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:54:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-s78wn_openshift-ovn-kubernetes(eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.389310 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-
cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.390454 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.390478 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.390486 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.390519 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.390532 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:22Z","lastTransitionTime":"2025-10-01T12:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.402090 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-75dqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-75dqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.416147 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.431736 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8889745c-17a6-44c0-be06-2f45a0f1316a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152301dc887033a9f46dded0a95a56edf7395806f427e60112739910a81dbc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c74d13ddaf4b7f8d5e56156f8bf3cbae697f483f9e6507bd673f1a0858a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7qmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 
12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.448423 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24be6f41-3569-45fc-ab82-805758988299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fafa3446313876d43dd3399cb4bcda6295ffc26df86ef9527278767327c1374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efbf0c7573d4d52655a656f8e7b194959cac7d6a16ce8ffe745d9afb60dccb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://647cbd9a63725397f8ffbd446cf836c3460abe9ddd35f3b467bb06f911b2c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.466275 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f77a9c5ff069b1de69400b48ecf50a0f01028d046323e1da2396606c318282\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.479470 4851 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.492630 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.492685 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.492708 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.492738 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.492760 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:22Z","lastTransitionTime":"2025-10-01T12:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.499834 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.512256 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d3461e8-2705-4809-b71a-6625721fe7a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58579541903dde948e229b4bf867706a43dcd3f6a005d845703bbd04842f0c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294ee2bdf1e056e24e8d283dd2cb54aecfc9b30083609e426f3228aed104b061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294ee2bdf1e056e24e8d283dd2cb54aecfc9b30083609e426f3228aed104b061\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.525640 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.539295 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.548281 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.561704 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":
\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11
\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.571190 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.580667 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.595601 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:22 crc 
kubenswrapper[4851]: I1001 12:54:22.595639 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.595676 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.595695 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.595715 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:22Z","lastTransitionTime":"2025-10-01T12:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.697846 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.697886 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.697898 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.697914 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.697926 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:22Z","lastTransitionTime":"2025-10-01T12:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.790723 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t5vvf_f68f162a-4e04-41d2-8197-95bac24aad23/kube-multus/0.log" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.790765 4851 generic.go:334] "Generic (PLEG): container finished" podID="f68f162a-4e04-41d2-8197-95bac24aad23" containerID="929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965" exitCode=1 Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.790792 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t5vvf" event={"ID":"f68f162a-4e04-41d2-8197-95bac24aad23","Type":"ContainerDied","Data":"929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965"} Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.791094 4851 scope.go:117] "RemoveContainer" containerID="929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.799445 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.799476 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.799484 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.799520 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.799534 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:22Z","lastTransitionTime":"2025-10-01T12:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.809817 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.826601 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8889745c-17a6-44c0-be06-2f45a0f1316a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152301dc887033a9f46dded0a95a56edf7395806f427e60112739910a81dbc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c74d13ddaf4b7f8d5e56156f8bf3cbae697f483f9e6507bd673f1a0858a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7qmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 
12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.837099 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24be6f41-3569-45fc-ab82-805758988299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fafa3446313876d43dd3399cb4bcda6295ffc26df86ef9527278767327c1374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efbf0c7573d4d52655a656f8e7b194959cac7d6a16ce8ffe745d9afb60dccb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://647cbd9a63725397f8ffbd446cf836c3460abe9ddd35f3b467bb06f911b2c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.850732 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f77a9c5ff069b1de69400b48ecf50a0f01028d046323e1da2396606c318282\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.862093 4851 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.880194 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.892796 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d3461e8-2705-4809-b71a-6625721fe7a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58579541903dde948e229b4bf867706a43dcd3f6a005d845703bbd04842f0c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294ee2bdf1e056e24e8d283dd2cb54aecfc9b30083609e426f3228aed104b061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294ee2bdf1e056e24e8d283dd2cb54aecfc9b30083609e426f3228aed104b061\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.901546 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.901586 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.901596 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.901609 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.901618 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:22Z","lastTransitionTime":"2025-10-01T12:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.913397 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.935276 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.949867 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.963674 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 
2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.978322 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:22 crc kubenswrapper[4851]: I1001 12:54:22.989846 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.004670 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.004709 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.004723 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.004740 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.004751 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:23Z","lastTransitionTime":"2025-10-01T12:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.008348 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster
-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.026825 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.056426 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e209631e3723f2f1d5027d67936d5ef6377bfb
1efbbfe66acd73f77c950ca0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20e209631e3723f2f1d5027d67936d5ef6377bfb1efbbfe66acd73f77c950ca0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:54:07Z\\\",\\\"message\\\":\\\"ics-daemon-75dqp openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx openshift-ovn-kubernetes/ovnkube-node-s78wn openshift-dns/node-resolver-5q84l openshift-machine-config-operator/machine-config-daemon-fv72m openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-node-identity/network-node-identity-vrzqb openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-multus/multus-additional-cni-plugins-brwh5 openshift-network-operator/iptables-alerter-4ln5h openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-kube-apiserver/kube-apiserver-crc openshift-network-diagnostics/network-check-target-xd92c]\\\\nI1001 12:54:07.287188 6572 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1001 12:54:07.287193 6572 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed cal\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:54:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-s78wn_openshift-ovn-kubernetes(eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.078906 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:54:21Z\\\",\\\"message\\\":\\\"2025-10-01T12:53:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4e03f2df-0adc-4d5c-a8d2-964de353dea5\\\\n2025-10-01T12:53:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4e03f2df-0adc-4d5c-a8d2-964de353dea5 to /host/opt/cni/bin/\\\\n2025-10-01T12:53:36Z [verbose] multus-daemon started\\\\n2025-10-01T12:53:36Z [verbose] Readiness Indicator file check\\\\n2025-10-01T12:54:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.092717 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-75dqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-75dqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-01T12:54:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.107969 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.108044 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.108067 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.108099 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.108122 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:23Z","lastTransitionTime":"2025-10-01T12:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.210378 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.210456 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.210475 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.210535 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.210553 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:23Z","lastTransitionTime":"2025-10-01T12:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.315250 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.315303 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.315318 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.315337 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.315356 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:23Z","lastTransitionTime":"2025-10-01T12:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.327391 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.327467 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:23 crc kubenswrapper[4851]: E1001 12:54:23.327529 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:23 crc kubenswrapper[4851]: E1001 12:54:23.327669 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.328694 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:23 crc kubenswrapper[4851]: E1001 12:54:23.328787 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.417259 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.417285 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.417296 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.417311 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.417321 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:23Z","lastTransitionTime":"2025-10-01T12:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.520053 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.520086 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.520099 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.520115 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.520125 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:23Z","lastTransitionTime":"2025-10-01T12:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.621862 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.621890 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.621902 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.621915 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.621926 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:23Z","lastTransitionTime":"2025-10-01T12:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.723462 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.723486 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.723510 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.723522 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.723530 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:23Z","lastTransitionTime":"2025-10-01T12:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.795587 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t5vvf_f68f162a-4e04-41d2-8197-95bac24aad23/kube-multus/0.log" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.795657 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t5vvf" event={"ID":"f68f162a-4e04-41d2-8197-95bac24aad23","Type":"ContainerStarted","Data":"9be4185c6a9a5112475d68907478d108366f2f6bff109e7d802e8b720953ffd9"} Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.812382 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-75dqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-75dqp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.827171 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.827208 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.827219 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.827238 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.827249 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:23Z","lastTransitionTime":"2025-10-01T12:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.830412 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358257
71aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.844171 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.869937 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e209631e3723f2f1d5027d67936d5ef6377bfb
1efbbfe66acd73f77c950ca0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20e209631e3723f2f1d5027d67936d5ef6377bfb1efbbfe66acd73f77c950ca0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:54:07Z\\\",\\\"message\\\":\\\"ics-daemon-75dqp openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx openshift-ovn-kubernetes/ovnkube-node-s78wn openshift-dns/node-resolver-5q84l openshift-machine-config-operator/machine-config-daemon-fv72m openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-node-identity/network-node-identity-vrzqb openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-multus/multus-additional-cni-plugins-brwh5 openshift-network-operator/iptables-alerter-4ln5h openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-kube-apiserver/kube-apiserver-crc openshift-network-diagnostics/network-check-target-xd92c]\\\\nI1001 12:54:07.287188 6572 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1001 12:54:07.287193 6572 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed cal\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:54:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-s78wn_openshift-ovn-kubernetes(eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.886440 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be4185c6a9a5112475d68907478d108366f2f6bff109e7d802e8b720953ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:54:21Z\\\",\\\"message\\\":\\\"2025-10-01T12:53:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4e03f2df-0adc-4d5c-a8d2-964de353dea5\\\\n2025-10-01T12:53:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4e03f2df-0adc-4d5c-a8d2-964de353dea5 to 
/host/opt/cni/bin/\\\\n2025-10-01T12:53:36Z [verbose] multus-daemon started\\\\n2025-10-01T12:53:36Z [verbose] Readiness Indicator file check\\\\n2025-10-01T12:54:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.900063 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.913182 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8889745c-17a6-44c0-be06-2f45a0f1316a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152301dc887033a9f46dded0a95a56edf7395806f427e60112739910a81dbc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c74d13ddaf4b7f8d5e56156f8bf3cbae697f483f9e6507bd673f1a0858a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7qmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:23Z is after 2025-08-24T17:21:41Z" Oct 01 
12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.925165 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24be6f41-3569-45fc-ab82-805758988299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fafa3446313876d43dd3399cb4bcda6295ffc26df86ef9527278767327c1374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efbf0c7573d4d52655a656f8e7b194959cac7d6a16ce8ffe745d9afb60dccb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://647cbd9a63725397f8ffbd446cf836c3460abe9ddd35f3b467bb06f911b2c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.929698 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.929717 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.929724 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.929737 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.929746 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:23Z","lastTransitionTime":"2025-10-01T12:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.935826 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f77a9c5ff069b1de69400b48ecf50a0f01028d046323e1da2396606c318282\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.949250 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.963203 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.974755 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.984316 4851 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:23 crc kubenswrapper[4851]: I1001 12:54:23.996629 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d3461e8-2705-4809-b71a-6625721fe7a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58579541903dde948e229b4bf867706a43dcd3f6a005d845703bbd04842f0c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294ee2bdf1e056e24e8d283dd2cb54aecfc9b30083609e426f3228aed104b061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294ee2bdf1e056e24e8d283dd2cb54aecfc9b30083609e426f3228aed104b061\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.011554 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:24Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.027332 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:24Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.033863 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.033884 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.033893 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.033904 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.033913 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:24Z","lastTransitionTime":"2025-10-01T12:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.037223 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:24Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.052650 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:24Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.136258 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.136282 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:24 crc 
kubenswrapper[4851]: I1001 12:54:24.136289 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.136301 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.136310 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:24Z","lastTransitionTime":"2025-10-01T12:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.237765 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.237790 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.237798 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.237808 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.237817 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:24Z","lastTransitionTime":"2025-10-01T12:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.328841 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:24 crc kubenswrapper[4851]: E1001 12:54:24.329069 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.339247 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.339275 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.339283 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.339297 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.339308 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:24Z","lastTransitionTime":"2025-10-01T12:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.441926 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.441989 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.442011 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.442043 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.442065 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:24Z","lastTransitionTime":"2025-10-01T12:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.544535 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.544568 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.544575 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.544590 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.544599 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:24Z","lastTransitionTime":"2025-10-01T12:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.646807 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.647092 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.647244 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.647375 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.647548 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:24Z","lastTransitionTime":"2025-10-01T12:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.750370 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.750413 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.750441 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.750457 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.750466 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:24Z","lastTransitionTime":"2025-10-01T12:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.852602 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.852639 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.852647 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.852660 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.852669 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:24Z","lastTransitionTime":"2025-10-01T12:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.957083 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.957162 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.957185 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.957214 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:24 crc kubenswrapper[4851]: I1001 12:54:24.957232 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:24Z","lastTransitionTime":"2025-10-01T12:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.063994 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.064059 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.064080 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.064110 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.064131 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:25Z","lastTransitionTime":"2025-10-01T12:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.166303 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.166369 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.166393 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.166422 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.166444 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:25Z","lastTransitionTime":"2025-10-01T12:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.269828 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.269869 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.269879 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.269896 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.269909 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:25Z","lastTransitionTime":"2025-10-01T12:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.328084 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:25 crc kubenswrapper[4851]: E1001 12:54:25.328192 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.328262 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.328282 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:25 crc kubenswrapper[4851]: E1001 12:54:25.328533 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:25 crc kubenswrapper[4851]: E1001 12:54:25.328739 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.372482 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.372528 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.372539 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.372554 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.372565 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:25Z","lastTransitionTime":"2025-10-01T12:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.474840 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.474895 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.474917 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.474945 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.474969 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:25Z","lastTransitionTime":"2025-10-01T12:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.578470 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.578572 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.578590 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.578613 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.578632 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:25Z","lastTransitionTime":"2025-10-01T12:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.681537 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.681586 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.681598 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.681615 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.681629 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:25Z","lastTransitionTime":"2025-10-01T12:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.783877 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.783919 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.783928 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.783945 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.783955 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:25Z","lastTransitionTime":"2025-10-01T12:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.886364 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.886418 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.886435 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.886458 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.886475 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:25Z","lastTransitionTime":"2025-10-01T12:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.988847 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.988895 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.988915 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.988942 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:25 crc kubenswrapper[4851]: I1001 12:54:25.988964 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:25Z","lastTransitionTime":"2025-10-01T12:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.091665 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.091745 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.091764 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.091787 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.091805 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:26Z","lastTransitionTime":"2025-10-01T12:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.195006 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.195044 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.195054 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.195070 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.195080 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:26Z","lastTransitionTime":"2025-10-01T12:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.297560 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.297597 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.297609 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.297624 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.297635 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:26Z","lastTransitionTime":"2025-10-01T12:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.328330 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:26 crc kubenswrapper[4851]: E1001 12:54:26.328571 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.400219 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.400290 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.400307 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.400332 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.400350 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:26Z","lastTransitionTime":"2025-10-01T12:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.503152 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.503204 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.503222 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.503246 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.503263 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:26Z","lastTransitionTime":"2025-10-01T12:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.605996 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.606051 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.606068 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.606088 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.606106 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:26Z","lastTransitionTime":"2025-10-01T12:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.709376 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.709437 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.709457 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.709481 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.709540 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:26Z","lastTransitionTime":"2025-10-01T12:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.842222 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.842256 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.842266 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.842282 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.842293 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:26Z","lastTransitionTime":"2025-10-01T12:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.945178 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.945208 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.945219 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.945243 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:26 crc kubenswrapper[4851]: I1001 12:54:26.945254 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:26Z","lastTransitionTime":"2025-10-01T12:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.047921 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.048022 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.048047 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.048072 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.048088 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:27Z","lastTransitionTime":"2025-10-01T12:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.150639 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.150680 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.150695 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.150753 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.150770 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:27Z","lastTransitionTime":"2025-10-01T12:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.253938 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.253968 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.253980 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.254018 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.254034 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:27Z","lastTransitionTime":"2025-10-01T12:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.328363 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.328408 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:27 crc kubenswrapper[4851]: E1001 12:54:27.328585 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.328616 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:27 crc kubenswrapper[4851]: E1001 12:54:27.328803 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:27 crc kubenswrapper[4851]: E1001 12:54:27.328987 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.357309 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.357368 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.357385 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.357407 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.357422 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:27Z","lastTransitionTime":"2025-10-01T12:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.461095 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.461165 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.461187 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.461219 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.461240 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:27Z","lastTransitionTime":"2025-10-01T12:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.564088 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.564146 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.564172 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.564191 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.564201 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:27Z","lastTransitionTime":"2025-10-01T12:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.666868 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.666931 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.666948 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.666973 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.666992 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:27Z","lastTransitionTime":"2025-10-01T12:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.769687 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.769788 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.769807 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.769831 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.769848 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:27Z","lastTransitionTime":"2025-10-01T12:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.871833 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.871884 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.871903 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.871926 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.871944 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:27Z","lastTransitionTime":"2025-10-01T12:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.973981 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.974022 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.974033 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.974050 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:27 crc kubenswrapper[4851]: I1001 12:54:27.974062 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:27Z","lastTransitionTime":"2025-10-01T12:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.076943 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.077006 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.077023 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.077046 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.077061 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:28Z","lastTransitionTime":"2025-10-01T12:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.180376 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.180460 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.180487 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.180560 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.180586 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:28Z","lastTransitionTime":"2025-10-01T12:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.283790 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.283870 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.283907 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.283937 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.283961 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:28Z","lastTransitionTime":"2025-10-01T12:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.327446 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:28 crc kubenswrapper[4851]: E1001 12:54:28.327653 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.386412 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.386457 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.386472 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.386490 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.386528 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:28Z","lastTransitionTime":"2025-10-01T12:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.489900 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.489973 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.489995 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.490029 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.490053 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:28Z","lastTransitionTime":"2025-10-01T12:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.593873 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.593939 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.593961 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.593992 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.594016 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:28Z","lastTransitionTime":"2025-10-01T12:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.697099 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.697171 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.697196 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.697225 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.697247 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:28Z","lastTransitionTime":"2025-10-01T12:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.800495 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.800599 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.800621 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.800658 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.800683 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:28Z","lastTransitionTime":"2025-10-01T12:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.903052 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.903112 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.903136 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.903169 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:28 crc kubenswrapper[4851]: I1001 12:54:28.903189 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:28Z","lastTransitionTime":"2025-10-01T12:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.010848 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.010891 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.010949 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.010972 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.010984 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:29Z","lastTransitionTime":"2025-10-01T12:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.113995 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.114037 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.114048 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.114067 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.114079 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:29Z","lastTransitionTime":"2025-10-01T12:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.217472 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.217525 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.217538 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.217558 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.217570 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:29Z","lastTransitionTime":"2025-10-01T12:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.321087 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.321143 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.321160 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.321183 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.321200 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:29Z","lastTransitionTime":"2025-10-01T12:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.328431 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.328467 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.328464 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:29 crc kubenswrapper[4851]: E1001 12:54:29.328668 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:29 crc kubenswrapper[4851]: E1001 12:54:29.328756 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:29 crc kubenswrapper[4851]: E1001 12:54:29.328951 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.424716 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.424785 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.424803 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.424830 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.424848 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:29Z","lastTransitionTime":"2025-10-01T12:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.528078 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.528127 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.528144 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.528166 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.528185 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:29Z","lastTransitionTime":"2025-10-01T12:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.630556 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.630656 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.630677 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.630702 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.630719 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:29Z","lastTransitionTime":"2025-10-01T12:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.732701 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.732772 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.732794 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.732822 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.732842 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:29Z","lastTransitionTime":"2025-10-01T12:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.835457 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.835541 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.835568 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.835591 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.835608 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:29Z","lastTransitionTime":"2025-10-01T12:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.938772 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.938822 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.938840 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.938862 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:29 crc kubenswrapper[4851]: I1001 12:54:29.938879 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:29Z","lastTransitionTime":"2025-10-01T12:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.042789 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.042856 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.042874 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.042899 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.042916 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:30Z","lastTransitionTime":"2025-10-01T12:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.144126 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.144187 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.144204 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.144232 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.144249 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:30Z","lastTransitionTime":"2025-10-01T12:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:30 crc kubenswrapper[4851]: E1001 12:54:30.164755 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:30Z is after 2025-08-24T17:21:41Z"
Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.169746 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.169840 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
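This record explains why the node object never converges: the status patch is rejected because the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 presents a certificate that expired on 2025-08-24, while the node clock reads 2025-10-01. A quick probe to confirm what the endpoint is serving (illustrative; InsecureSkipVerify is deliberate so the expired certificate can be inspected rather than trusted):

```go
// Diagnostic sketch: dial the webhook endpoint from the error above and
// print the validity window of the certificate it presents.
package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial:", err)
		return
	}
	defer conn.Close()
	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%s notBefore=%s notAfter=%s expired=%v\n",
			cert.Subject, cert.NotBefore.Format(time.RFC3339),
			cert.NotAfter.Format(time.RFC3339), time.Now().After(cert.NotAfter))
	}
}
```

If the window printed here ends at the 2025-08-24 date seen in the error, the webhook's serving certificate has to be rotated before node status updates can succeed.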
event="NodeHasNoDiskPressure" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.169866 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.169893 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.169910 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:30Z","lastTransitionTime":"2025-10-01T12:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:30 crc kubenswrapper[4851]: E1001 12:54:30.190741 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.195837 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.195884 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.195902 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.195923 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.195942 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:30Z","lastTransitionTime":"2025-10-01T12:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:30 crc kubenswrapper[4851]: E1001 12:54:30.217148 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.222088 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.222133 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.222149 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.222171 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.222189 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:30Z","lastTransitionTime":"2025-10-01T12:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:30 crc kubenswrapper[4851]: E1001 12:54:30.242203 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.247835 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.247882 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
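Every failed patch in this burst shares one root cause: the node-identity webhook at https://127.0.0.1:9743 presents a serving certificate whose validity ended at 2025-08-24T17:21:41Z, while the node clock reads 2025-10-01T12:54:30Z, so the kubelet's TLS client aborts the handshake before the status patch is ever evaluated. The "certificate has expired or is not yet valid" text is Go's standard x509 validity-window error. A minimal, self-contained Go sketch of that window check (the certificate path is hypothetical, for illustration only):

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	// Hypothetical path; substitute the webhook's actual serving certificate.
	data, err := os.ReadFile("/path/to/webhook-serving.crt")
	if err != nil {
		panic(err)
	}
	block, _ := pem.Decode(data)
	if block == nil {
		panic("no PEM block in certificate file")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		panic(err)
	}
	now := time.Now().UTC()
	switch {
	case now.After(cert.NotAfter):
		// The condition behind "x509: certificate has expired or is not yet valid".
		fmt.Printf("expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
	case now.Before(cert.NotBefore):
		fmt.Printf("not yet valid: current time %s is before %s\n",
			now.Format(time.RFC3339), cert.NotBefore.UTC().Format(time.RFC3339))
	default:
		fmt.Println("certificate is within its validity window")
	}
}

Until the webhook's serving certificate is rotated so that NotBefore <= now <= NotAfter, every node-status patch this admission webhook intercepts can be expected to keep failing the same way.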
event="NodeHasNoDiskPressure" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.247896 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.247912 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.247925 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:30Z","lastTransitionTime":"2025-10-01T12:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:30 crc kubenswrapper[4851]: E1001 12:54:30.267491 4851 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e8d9204a-cae6-4011-96df-416039ccb8ba\\\",\\\"systemUUID\\\":\\\"676661b5-1208-494a-8f95-09660995f514\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:30 crc kubenswrapper[4851]: E1001 12:54:30.267643 4851 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.269174 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
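This cycle shows five consecutive "Error updating node status, will retry" entries ending at 12:54:30.267491, then exactly one "Unable to update node status" with "update node status exceeds retry count": the kubelet caps status-patch attempts per sync at a small retry constant (nodeStatusUpdateRetry, 5 in the upstream kubelet) and defers further attempts to the next sync. A schematic, runnable Go sketch of that bounded loop, simplified for illustration and not the kubelet's actual implementation:

package main

import (
	"errors"
	"fmt"
)

// Mirrors the kubelet constant that bounds status-update attempts per sync.
const nodeStatusUpdateRetry = 5

// Stand-in for the real PATCH against the API server; here it always fails
// the way this log shows (admission webhook unreachable behind an expired cert).
func tryUpdateNodeStatus() error {
	return errors.New(`failed calling webhook "node.network-node-identity.openshift.io": x509: certificate has expired or is not yet valid`)
}

func main() {
	var err error
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err = tryUpdateNodeStatus(); err != nil {
			fmt.Printf("attempt %d: Error updating node status, will retry: %v\n", i+1, err)
			continue
		}
		break
	}
	if err != nil {
		// After the final failed attempt, one summary error is emitted per sync.
		fmt.Println("Unable to update node status: update node status exceeds retry count")
	}
}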
event="NodeHasSufficientMemory" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.269215 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.269231 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.269248 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.269261 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:30Z","lastTransitionTime":"2025-10-01T12:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.327320 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:30 crc kubenswrapper[4851]: E1001 12:54:30.327591 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.347110 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.371767 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.372323 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.372496 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.372705 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.372858 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:30Z","lastTransitionTime":"2025-10-01T12:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.476332 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.476776 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.476976 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.477163 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.477354 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:30Z","lastTransitionTime":"2025-10-01T12:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.580320 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.580391 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.580413 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.580440 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.580463 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:30Z","lastTransitionTime":"2025-10-01T12:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.683920 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.683989 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.684011 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.684039 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.684059 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:30Z","lastTransitionTime":"2025-10-01T12:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
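Alongside the webhook failures, the node keeps re-recording the same Ready=False condition because the runtime reports NetworkReady=false: nothing has written a CNI config into /etc/kubernetes/cni/net.d/ yet (on this OVN-Kubernetes-based CRC cluster the network provider itself likely cannot come up while the node-identity webhook is broken). The check behind "no CNI configuration file in /etc/kubernetes/cni/net.d/" amounts to scanning that directory for a network configuration file; a small illustrative Go sketch, assuming the conventional .conf/.conflist/.json extensions:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // the directory named in the log
	var found []string
	// Any .conf, .conflist, or .json file counts as a candidate network config.
	for _, pattern := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, err := filepath.Glob(filepath.Join(confDir, pattern))
		if err != nil {
			panic(err)
		}
		found = append(found, matches...)
	}
	if len(found) == 0 {
		// The state reported as NetworkReady=false / NetworkPluginNotReady.
		fmt.Printf("no CNI configuration file in %s. Has your network provider started?\n", confDir)
		os.Exit(1)
	}
	fmt.Println("CNI config present:", found)
}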
Has your network provider started?"} Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.786849 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.786899 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.786915 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.786938 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.786954 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:30Z","lastTransitionTime":"2025-10-01T12:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.889780 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.889840 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.889856 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.889877 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.889893 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:30Z","lastTransitionTime":"2025-10-01T12:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.992663 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.992716 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.992742 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.992768 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:30 crc kubenswrapper[4851]: I1001 12:54:30.992790 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:30Z","lastTransitionTime":"2025-10-01T12:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.096375 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.096427 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.096443 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.096463 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.096474 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:31Z","lastTransitionTime":"2025-10-01T12:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.199925 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.200011 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.200029 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.200053 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.200070 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:31Z","lastTransitionTime":"2025-10-01T12:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.303145 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.303201 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.303218 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.303243 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.303261 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:31Z","lastTransitionTime":"2025-10-01T12:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.327804 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.327865 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.327838 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:31 crc kubenswrapper[4851]: E1001 12:54:31.328009 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:54:31 crc kubenswrapper[4851]: E1001 12:54:31.328109 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:31 crc kubenswrapper[4851]: E1001 12:54:31.328266 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.329292 4851 scope.go:117] "RemoveContainer" containerID="20e209631e3723f2f1d5027d67936d5ef6377bfb1efbbfe66acd73f77c950ca0" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.406550 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.406947 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.406973 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.407005 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.407027 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:31Z","lastTransitionTime":"2025-10-01T12:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.510022 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.510119 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.510137 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.510164 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.510181 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:31Z","lastTransitionTime":"2025-10-01T12:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.617095 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.617130 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.617139 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.617153 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.617165 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:31Z","lastTransitionTime":"2025-10-01T12:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.720296 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.720346 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.720363 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.720387 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.720405 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:31Z","lastTransitionTime":"2025-10-01T12:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.824737 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.824786 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.824801 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.824822 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.824839 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:31Z","lastTransitionTime":"2025-10-01T12:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.895481 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s78wn_eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9/ovnkube-controller/2.log" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.902892 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerStarted","Data":"a1b6af2f6ac4497d8c14ffd58bab6752f79d0bda96bd501432860664e60954d4"} Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.903260 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.922425 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.927561 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.927630 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.927655 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.927685 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.927707 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:31Z","lastTransitionTime":"2025-10-01T12:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.936219 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d3461e8-2705-4809-b71a-6625721fe7a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58579541903dde948e229b4bf867706a43dcd3f6a005d845703bbd04842f0c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294ee2bdf1e056e24e8d283dd2cb54aecfc9b30083609e426f3228aed104b061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294ee2bdf1e056e24e8d283dd2cb54aecfc9b30083609e426f3228aed104b061\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-10-01T12:54:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.951114 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.966698 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:31 crc kubenswrapper[4851]: I1001 12:54:31.978560 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.002816 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":
\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11
\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.019062 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.030553 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.030585 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.030593 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.030608 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.030618 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:32Z","lastTransitionTime":"2025-10-01T12:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.033874 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.063379 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75dc921-2756-4a6b-9d2e-389577cef7d2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cc043765b4eb1500eb1995653ca94504b3ba1140d55de4e109fbddaa359a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4999b7eb159819f19cd0ed4827425ca03fb1b60e77e680d94de20d62a63d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28b04d167e9bba0d4920689cc7a8981acd7ef4cc88083aca77412ad5bd34a29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab737dd74404443cbd18ead2cc275f4703cb01a13285f28d1cfd83c0b491dd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2f807e6f28ee67accd968e9bc3a4a27253729a8f9881a4e6047b3f35a2cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18608a1fa6b6d967a4d14d4d7bdd4a6a45156e36bd514af9e8680b69256f2add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18608a1fa6b6d967a4d14d4d7bdd4a6a45156e36bd514af9e8680b69256f2add\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d23bb696019ddd2cabfc396f5da3be99196be13e721b6ae1a761bf90df632765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d23bb696019ddd2cabfc396f5da3be99196be13e721b6ae1a761bf90df632765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f3f2e95d9263d7a363fe56839a56f9499fd18b25ef33d2633cf69f39ed3d94c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f2e95d9263d7a363fe56839a56f9499fd18b25ef33d2633cf69f39ed3d94c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.087366 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.121766 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b6af2f6ac4497d8c14ffd58bab6752f79d0bda
96bd501432860664e60954d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20e209631e3723f2f1d5027d67936d5ef6377bfb1efbbfe66acd73f77c950ca0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:54:07Z\\\",\\\"message\\\":\\\"ics-daemon-75dqp openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx openshift-ovn-kubernetes/ovnkube-node-s78wn openshift-dns/node-resolver-5q84l openshift-machine-config-operator/machine-config-daemon-fv72m openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-node-identity/network-node-identity-vrzqb openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-multus/multus-additional-cni-plugins-brwh5 openshift-network-operator/iptables-alerter-4ln5h openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-kube-apiserver/kube-apiserver-crc openshift-network-diagnostics/network-check-target-xd92c]\\\\nI1001 12:54:07.287188 6572 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1001 12:54:07.287193 6572 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed 
cal\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:54:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"co
ntainerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.134569 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.134621 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.134637 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.134659 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.134678 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:32Z","lastTransitionTime":"2025-10-01T12:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.148193 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be4185c6a9a5112475d68907478d108366f2f6bff109e7d802e8b720953ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:54:21Z\\\",\\\"message\\\":\\\"2025-10-01T12:53:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4e03f2df-0adc-4d5c-a8d2-964de353dea5\\\\n2025-10-01T12:53:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4e03f2df-0adc-4d5c-a8d2-964de353dea5 to /host/opt/cni/bin/\\\\n2025-10-01T12:53:36Z [verbose] multus-daemon started\\\\n2025-10-01T12:53:36Z [verbose] Readiness Indicator file check\\\\n2025-10-01T12:54:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.163712 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-75dqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-75dqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.177043 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.189051 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8889745c-17a6-44c0-be06-2f45a0f1316a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152301dc887033a9f46dded0a95a56edf7395806f427e60112739910a81dbc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c74d13ddaf4b7f8d5e56156f8bf3cbae697f483f9e6507bd673f1a0858a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7qmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 
12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.208790 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24be6f41-3569-45fc-ab82-805758988299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fafa3446313876d43dd3399cb4bcda6295ffc26df86ef9527278767327c1374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efbf0c7573d4d52655a656f8e7b194959cac7d6a16ce8ffe745d9afb60dccb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://647cbd9a63725397f8ffbd446cf836c3460abe9ddd35f3b467bb06f911b2c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.226352 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f77a9c5ff069b1de69400b48ecf50a0f01028d046323e1da2396606c318282\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.236935 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.236999 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.237029 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.237055 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.237072 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:32Z","lastTransitionTime":"2025-10-01T12:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.242844 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.260866 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.327852 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:32 crc kubenswrapper[4851]: E1001 12:54:32.328001 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.340437 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.340481 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.340530 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.340552 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.340569 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:32Z","lastTransitionTime":"2025-10-01T12:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.351308 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f77a9c5ff069b1de69400b48ecf50a0f01028d046323e1da2396606c318282\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.370496 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.389626 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.405222 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24be6f41-3569-45fc-ab82-805758988299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fafa3446313876d43dd3399cb4bcda6295ffc26df86ef9527278767327c1374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efbf0c7573d4d52655a656f8e7b194959cac7d6a16ce8ffe745d9afb60dccb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://647cbd9a63725397f8ffbd446cf836c3460abe9ddd35f3b467bb06f911b2c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.422167 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.443917 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.444016 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.444034 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.444059 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.444076 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:32Z","lastTransitionTime":"2025-10-01T12:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.445067 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.461792 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.479384 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.499025 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.512436 4851 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.527953 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d3461e8-2705-4809-b71a-6625721fe7a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58579541903dde948e229b4bf867706a43dcd3f6a005d845703bbd04842f0c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294ee2bdf1e056e24e8d283dd2cb54aecfc9b30083609e426f3228aed104b061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294ee2bdf1e056e24e8d283dd2cb54aecfc9b30083609e426f3228aed104b061\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.546092 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.546158 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.546172 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.546190 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.546206 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:32Z","lastTransitionTime":"2025-10-01T12:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.560405 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75dc921-2756-4a6b-9d2e-389577cef7d2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cc043765b4eb1500eb1995653ca94504b3ba1140d55de4e109fbddaa359a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4999b7eb159819f19cd0ed4827425ca03fb1b60e77e680d94de20d62a63d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28b04d167e9bba0d4920689cc7a8981acd7ef4cc88083aca77412ad5bd34a29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab737dd74404443cbd18ead2cc275f4703cb01a13285f28d1cfd83c0b491dd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2f807e6f28ee67accd968e9bc3a4a27253729a8f9881a4e6047b3f35a2cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18608a1fa6b6d967a4d14d4d7bdd4a6a45156e36bd514af9e8680b69256f2add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18608a1fa6b6d967a4d14d4d7bdd4a6a45156e36bd514af9e8680b69256f2add\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d23bb696019ddd2cabfc396f5da3be99196be13e721b6ae1a761bf90df632765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d23bb696019ddd2cabfc396f5da3be99196be13e721b6ae1a761bf90df632765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f3f2e95d9263d7a363fe56839a56f9499fd18b25ef33d2633cf69f39ed3d94c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f2e95d9263d7a363fe56839a56f9499fd18b25ef33d2633cf69f39ed3d94c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.581108 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.611320 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b6af2f6ac4497d8c14ffd58bab6752f79d0bda
96bd501432860664e60954d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20e209631e3723f2f1d5027d67936d5ef6377bfb1efbbfe66acd73f77c950ca0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:54:07Z\\\",\\\"message\\\":\\\"ics-daemon-75dqp openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx openshift-ovn-kubernetes/ovnkube-node-s78wn openshift-dns/node-resolver-5q84l openshift-machine-config-operator/machine-config-daemon-fv72m openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-node-identity/network-node-identity-vrzqb openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-multus/multus-additional-cni-plugins-brwh5 openshift-network-operator/iptables-alerter-4ln5h openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-kube-apiserver/kube-apiserver-crc openshift-network-diagnostics/network-check-target-xd92c]\\\\nI1001 12:54:07.287188 6572 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1001 12:54:07.287193 6572 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed 
cal\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:54:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"co
ntainerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.631285 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be4185c6a9a5112475d68907478d108366f2f6bff109e7d802e8b720953ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:54:21Z\\\",\\\"message\\\":\\\"2025-10-01T12:53:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_4e03f2df-0adc-4d5c-a8d2-964de353dea5\\\\n2025-10-01T12:53:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4e03f2df-0adc-4d5c-a8d2-964de353dea5 to /host/opt/cni/bin/\\\\n2025-10-01T12:53:36Z [verbose] multus-daemon started\\\\n2025-10-01T12:53:36Z [verbose] Readiness Indicator file check\\\\n2025-10-01T12:54:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.646566 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-75dqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-75dqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.649114 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.649186 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.649215 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.649247 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.649273 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:32Z","lastTransitionTime":"2025-10-01T12:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.666222 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.683141 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8889745c-17a6-44c0-be06-2f45a0f1316a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152301dc887033a9f46dded0a95a56edf7395806f427e60112739910a81dbc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c74d13ddaf4b7f8d5e56156f8bf3cbae697f483f9e6507bd673f1a0858a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7qmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 
12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.700870 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.751879 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.751935 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.751951 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.751975 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.751993 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:32Z","lastTransitionTime":"2025-10-01T12:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.855090 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.855142 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.855158 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.855182 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.855200 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:32Z","lastTransitionTime":"2025-10-01T12:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.909182 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s78wn_eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9/ovnkube-controller/3.log" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.910393 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s78wn_eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9/ovnkube-controller/2.log" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.914671 4851 generic.go:334] "Generic (PLEG): container finished" podID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerID="a1b6af2f6ac4497d8c14ffd58bab6752f79d0bda96bd501432860664e60954d4" exitCode=1 Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.914724 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerDied","Data":"a1b6af2f6ac4497d8c14ffd58bab6752f79d0bda96bd501432860664e60954d4"} Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.914768 4851 scope.go:117] "RemoveContainer" containerID="20e209631e3723f2f1d5027d67936d5ef6377bfb1efbbfe66acd73f77c950ca0" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.915813 4851 scope.go:117] "RemoveContainer" containerID="a1b6af2f6ac4497d8c14ffd58bab6752f79d0bda96bd501432860664e60954d4" Oct 01 12:54:32 crc kubenswrapper[4851]: E1001 12:54:32.919355 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-s78wn_openshift-ovn-kubernetes(eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.934715 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.958767 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.959258 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.959284 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.959314 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.959337 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:32Z","lastTransitionTime":"2025-10-01T12:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.968037 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75dc921-2756-4a6b-9d2e-389577cef7d2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cc043765b4eb1500eb1995653ca94504b3ba1140d55de4e109fbddaa359a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4999b7eb159819f19cd0ed4827425ca03fb1b60e77e680d94de20d62a63d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28b04d167e9bba0d4920689cc7a8981acd7ef4cc88083aca77412ad5bd34a29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab737dd74404443cbd18ead2cc275f4703cb01a13285f28d1cfd83c0b491dd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2f807e6f28ee67accd968e9bc3a4a27253729a8f9881a4e6047b3f35a2cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18608a1fa6b6d967a4d14d4d7bdd4a6a45156e36bd514af9e8680b69256f2add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18608a1fa6b6d967a4d14d4d7bdd4a6a45156e36bd514af9e8680b69256f2add\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d23bb696019ddd2cabfc396f5da3be99196be13e721b6ae1a761bf90df632765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d23bb696019ddd2cabfc396f5da3be99196be13e721b6ae1a761bf90df632765\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f3f2e95d9263d7a363fe56839a56f9499fd18b25ef33d2633cf69f39ed3d94c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f2e95d9263d7a363fe56839a56f9499fd18b25ef33d2633cf69f39ed3d94c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:32 crc kubenswrapper[4851]: I1001 12:54:32.987668 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.013260 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b6af2f6ac4497d8c14ffd58bab6752f79d0bda
96bd501432860664e60954d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20e209631e3723f2f1d5027d67936d5ef6377bfb1efbbfe66acd73f77c950ca0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:54:07Z\\\",\\\"message\\\":\\\"ics-daemon-75dqp openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx openshift-ovn-kubernetes/ovnkube-node-s78wn openshift-dns/node-resolver-5q84l openshift-machine-config-operator/machine-config-daemon-fv72m openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-node-identity/network-node-identity-vrzqb openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-multus/multus-additional-cni-plugins-brwh5 openshift-network-operator/iptables-alerter-4ln5h openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-kube-apiserver/kube-apiserver-crc openshift-network-diagnostics/network-check-target-xd92c]\\\\nI1001 12:54:07.287188 6572 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1001 12:54:07.287193 6572 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed cal\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:54:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b6af2f6ac4497d8c14ffd58bab6752f79d0bda96bd501432860664e60954d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:54:32Z\\\",\\\"message\\\":\\\"o:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx\\\\nI1001 12:54:32.321208 6872 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-fv72m after 0 failed attempt(s)\\\\nI1001 12:54:32.321259 6872 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx in node crc\\\\nI1001 12:54:32.321264 6872 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-fv72m\\\\nI1001 12:54:32.321268 6872 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1001 12:54:32.321242 6872 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc 
annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"http\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11
a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.033991 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be4185c6a9a5112475d68907478d108366f2f6bff109e7d802e8b720953ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:54:21Z\\\",\\\"message\\\":\\\"2025-10-01T12:53:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4e03f2df-0adc-4d5c-a8d2-964de353dea5\\\\n2025-10-01T12:53:36+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4e03f2df-0adc-4d5c-a8d2-964de353dea5 to /host/opt/cni/bin/\\\\n2025-10-01T12:53:36Z [verbose] multus-daemon started\\\\n2025-10-01T12:53:36Z [verbose] Readiness Indicator file check\\\\n2025-10-01T12:54:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.051932 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-75dqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-75dqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.065450 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.065553 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.065579 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.065612 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.065635 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:33Z","lastTransitionTime":"2025-10-01T12:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.070908 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.091231 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8889745c-17a6-44c0-be06-2f45a0f1316a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152301dc887033a9f46dded0a95a56edf7395806f427e60112739910a81dbc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c74d13ddaf4b7f8d5e56156f8bf3cbae697f483f9e6507bd673f1a0858a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7qmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:33Z is after 2025-08-24T17:21:41Z" Oct 01 
12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.110952 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24be6f41-3569-45fc-ab82-805758988299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fafa3446313876d43dd3399cb4bcda6295ffc26df86ef9527278767327c1374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efbf0c7573d4d52655a656f8e7b194959cac7d6a16ce8ffe745d9afb60dccb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://647cbd9a63725397f8ffbd446cf836c3460abe9ddd35f3b467bb06f911b2c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.137042 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f77a9c5ff069b1de69400b48ecf50a0f01028d046323e1da2396606c318282\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.156273 4851 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.169596 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.169656 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.169674 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.169698 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.169720 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:33Z","lastTransitionTime":"2025-10-01T12:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.178593 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.196113 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d3461e8-2705-4809-b71a-6625721fe7a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58579541903dde948e229b4bf867706a43dcd3f6a005d845703bbd04842f0c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294ee2bdf1e056e24e8d283dd2cb54aecfc9b30083609e426f3228aed104b061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294ee2bdf1e056e24e8d283dd2cb54aecfc9b30083609e426f3228aed104b061\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.217119 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.236917 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.253765 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:33Z is after 2025-08-24T17:21:41Z"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.274233 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.274314 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.274340 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.274374 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.274398 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:33Z","lastTransitionTime":"2025-10-01T12:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.280254 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.299479 4851 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:33Z is after 2025-08-24T17:21:41Z" Oct 01 
12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.325953 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.328396 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.328398 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.328480 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 12:54:33 crc kubenswrapper[4851]: E1001 12:54:33.329099 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 12:54:33 crc kubenswrapper[4851]: E1001 12:54:33.328800 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 12:54:33 crc kubenswrapper[4851]: E1001 12:54:33.329121 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.377874 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.377938 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.377957 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.377982 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.378002 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:33Z","lastTransitionTime":"2025-10-01T12:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.481001 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.481044 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.481056 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.481105 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.481126 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:33Z","lastTransitionTime":"2025-10-01T12:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.584149 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.584209 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.584226 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.584255 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.584272 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:33Z","lastTransitionTime":"2025-10-01T12:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.790914 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.790971 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.790988 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.791012 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.791033 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:33Z","lastTransitionTime":"2025-10-01T12:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.894622 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.894666 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.894679 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.894698 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.894714 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:33Z","lastTransitionTime":"2025-10-01T12:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.920744 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s78wn_eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9/ovnkube-controller/3.log" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.926435 4851 scope.go:117] "RemoveContainer" containerID="a1b6af2f6ac4497d8c14ffd58bab6752f79d0bda96bd501432860664e60954d4" Oct 01 12:54:33 crc kubenswrapper[4851]: E1001 12:54:33.926879 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-s78wn_openshift-ovn-kubernetes(eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.946264 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24be6f41-3569-45fc-ab82-805758988299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fafa3446313876d43dd3399cb4bcda6295ffc26df86ef9527278767327c1374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efbf0c7573d4d52655a656f8e7b194959cac7d6a16ce8ffe745d9afb60dccb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://647cbd9a63725397f8ffbd446cf836c3460abe9ddd35f3b467bb06f911b2c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44615d07b1a5d6aa1ecd46c0ab53ef3b763cfa2e84773c865e0590313f14a4c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.970210 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9da7223-80aa-4391-8207-0638ef8cc36d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ce96e483ea6a5496fa519bca1df66038462ba8a74a689ee348a3ed2863e7a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92ccffb577ea04c311ec630c89d843c80ec1555e13fce168d0582e1b0f87d8a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61650ec2e20253de8835f30fb435186f7eb662fb3851cf348400db252640a9fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f77a9c5ff069b1de69400b48ecf50a0f01028d046323e1da2396606c318282\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e5491530d891bebf7a56e38710b626da982be29cf4dd6d505b0dfa5470bae16\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:53:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 12:53:32.324963 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:53:32.325218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:53:32.326452 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3735134478/tls.crt::/tmp/serving-cert-3735134478/tls.key\\\\\\\"\\\\nI1001 12:53:32.678885 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:53:32.684152 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:53:32.684170 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:53:32.684194 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:53:32.684200 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:53:32.691687 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:53:32.691713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:53:32.691720 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:53:32.691716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:53:32.691728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:53:32.691735 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:53:32.691739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:53:32.691741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:53:32.694239 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4920092c8caf84b05f417625c050f30883edd5604d9deddc0da1e71285455f7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43efe60995dc3519ca41394ea74e37627c53935c9f3f8d16769a5530b65c7fa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.999461 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.999540 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:33 crc kubenswrapper[4851]: I1001 12:54:33.999558 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:33.999583 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:33.999603 4851 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:33Z","lastTransitionTime":"2025-10-01T12:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.000732 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.027272 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931d4eb1e75619abe4759455fd15c8cd2e05848d288c8c01b0ca3be8aa687ce5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5dde226944a55daca6f38c6b0f879bd00c230c62c44615f55acda602349db1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.043906 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc8gl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2437d3f1-5aab-476a-9c9d-16781db5aa71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bbc70bd2d4dd7e743c46a6521aabcf2bddb80c58982e1e9ae6ba8f526a1fe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j52ts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc8gl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.057040 4851 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d3461e8-2705-4809-b71a-6625721fe7a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58579541903dde948e229b4bf867706a43dcd3f6a005d845703bbd04842f0c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294ee2bdf1e056e24e8d283dd2cb54aecfc9b30083609e426f3228aed104b061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294ee2bdf1e056e24e8d283dd2cb54aecfc9b30083609e426f3228aed104b061\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.075015 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.090761 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.101308 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.101350 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.101366 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.101386 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.101398 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:34Z","lastTransitionTime":"2025-10-01T12:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.103476 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5q84l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9b653bb-2bc1-4a97-8699-326054117be2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f1e223194216c0e54ca6cc662ecd363ff8ce368e44be2e788025fe40dde124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q78w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5q84l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.123694 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-brwh5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e98556a-d5e5-44d6-ad39-d13303fc263c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec96383973ce9331ab9adbfb2619ee47aec252a422c9c346638b7ba94ba2eabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feacd3c7228b2143d6cf17cc78daa1c3b2f884b66e19f868a87d1d35c6f75dcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b04d29088b1c3ebea80600dc1b30ee728a8499de4f29904929cfbba1eab77f9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87ad2ed1cfd2ca9a8c2a787474af1b4c49cdcf038a0e579c3f14a1630563de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f80dc8bc48fbccf2ab9598aaed0fcfbe584d22e46e84b7a2a17051aa708adce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f64441b633664cac4a9fcbc7ad5ec7a3592b5d5d30036bbc9b4dbb3c230ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a036952a9c10be1e96ce44be7f004107b32bceb11abec4865d468186c706c6e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmv5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-brwh5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.142106 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3acff5c-c60b-4f54-acfa-5521ded8b2af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff8bae866285a25cca1d18cbd68d5b98d45c400d5a57126d95aae056a40d1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2r4dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv72m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.161050 4851 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23a7271b-9426-4ff2-b3b4-a05522c8070e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8926b260d43ea02bfc184c40ff0df3a29b4bb77226e222296c2b488f8ffe328c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9143b12e1c7db393c603808848960518ae5360b42bbdff0e873e8860e59cf5c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c393d6d0a38a9373bafa9888c231919275b8c314c03c5f147fa1ea3822b473d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251e2bf362e06606d5fd79d5ae
50ee9f45944a3f17716cda1899afe766e539e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.195928 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75dc921-2756-4a6b-9d2e-389577cef7d2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cc043765b4eb1500eb1995653ca94504b3ba1140d55de4e109fbddaa359a0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4999b7eb159819f19cd0ed
4827425ca03fb1b60e77e680d94de20d62a63d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28b04d167e9bba0d4920689cc7a8981acd7ef4cc88083aca77412ad5bd34a29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab737dd74404443cbd18ead2cc275f4703cb01a13285f28d1cfd83c0b491dd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d2f807e6f28ee67accd968e9bc3a4a27253729a8f9881a4e6047b3f35a2cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18608a1fa6b6d967a4d14d4d7bdd4a6a45156e36bd514af9e8680b69256f2add\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18608a1fa6b6d967a4d14d4d7bdd4a6a45156e36bd514af9e8680b69256f2add\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d23bb696019ddd2cabfc396f5da3be99196be13e721b6ae1a761bf90df632765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d23bb696019ddd2cabfc396f5da3be99196be13e721b6ae1a761bf90df632765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f3f2e95d9263d7a363fe56839a56f9499fd18b25ef33d2633cf69f39ed3d94c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f2e95d9263d7a363fe56839a56f9499fd18b25ef33d2633cf69f39ed3d94c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.204024 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.204068 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:34 crc 
kubenswrapper[4851]: I1001 12:54:34.204087 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.204111 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.204127 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:34Z","lastTransitionTime":"2025-10-01T12:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.218295 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810135147b39ba1f0177e38c8b209ee4cc4596e34ed1ecf83e7e4f2108946fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.249108 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b6af2f6ac4497d8c14ffd58bab6752f79d0bda96bd501432860664e60954d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b6af2f6ac4497d8c14ffd58bab6752f79d0bda96bd501432860664e60954d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:54:32Z\\\",\\\"message\\\":\\\"o:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx\\\\nI1001 12:54:32.321208 6872 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-fv72m after 0 failed attempt(s)\\\\nI1001 12:54:32.321259 6872 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx in node crc\\\\nI1001 12:54:32.321264 6872 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-fv72m\\\\nI1001 12:54:32.321268 6872 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1001 12:54:32.321242 6872 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"http\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:54:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-s78wn_openshift-ovn-kubernetes(eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9jss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-s78wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.271727 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t5vvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f68f162a-4e04-41d2-8197-95bac24aad23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be4185c6a9a5112475d68907478d108366f2f6bff109e7d802e8b720953ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:54:21Z\\\",\\\"message\\\":\\\"2025-10-01T12:53:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4e03f2df-0adc-4d5c-a8d2-964de353dea5\\\\n2025-10-01T12:53:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4e03f2df-0adc-4d5c-a8d2-964de353dea5 to 
/host/opt/cni/bin/\\\\n2025-10-01T12:53:36Z [verbose] multus-daemon started\\\\n2025-10-01T12:53:36Z [verbose] Readiness Indicator file check\\\\n2025-10-01T12:54:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p4f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t5vvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.290152 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-75dqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnvh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-75dqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.334828 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.334888 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.334910 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.334921 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.334939 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.335105 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:34Z","lastTransitionTime":"2025-10-01T12:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:34 crc kubenswrapper[4851]: E1001 12:54:34.335142 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.342517 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf832a5e0641a797534588452ede5db9df052540790d726bb369520472ece7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T12:54:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.362201 4851 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8889745c-17a6-44c0-be06-2f45a0f1316a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152301dc887033a9f46dded0a95a56edf7395806f427e60112739910a81dbc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c74d13ddaf4b7f8d5e56156f8bf3cbae697f483f9e6507bd673f1a0858a309a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxczz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7qmx\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:54:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.438102 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.438160 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.438179 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.438206 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.438224 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:34Z","lastTransitionTime":"2025-10-01T12:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.540893 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.540946 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.540966 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.540992 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.541009 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:34Z","lastTransitionTime":"2025-10-01T12:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.643369 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.643435 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.643457 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.643486 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.643551 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:34Z","lastTransitionTime":"2025-10-01T12:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.746831 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.746925 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.746946 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.746974 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.746992 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:34Z","lastTransitionTime":"2025-10-01T12:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.849776 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.850076 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.850216 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.850408 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.850662 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:34Z","lastTransitionTime":"2025-10-01T12:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.953466 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.953547 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.953849 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.953974 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:34 crc kubenswrapper[4851]: I1001 12:54:34.953995 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:34Z","lastTransitionTime":"2025-10-01T12:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.057445 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.057494 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.057542 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.057564 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.057582 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:35Z","lastTransitionTime":"2025-10-01T12:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.160011 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.160068 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.160087 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.160110 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.160131 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:35Z","lastTransitionTime":"2025-10-01T12:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.262410 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.262471 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.262482 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.262522 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.262534 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:35Z","lastTransitionTime":"2025-10-01T12:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.328195 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.328265 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.328224 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:35 crc kubenswrapper[4851]: E1001 12:54:35.328379 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:35 crc kubenswrapper[4851]: E1001 12:54:35.328495 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:35 crc kubenswrapper[4851]: E1001 12:54:35.328623 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.364540 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.364593 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.364609 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.364633 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.364651 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:35Z","lastTransitionTime":"2025-10-01T12:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.467674 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.467699 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.467707 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.467719 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.467730 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:35Z","lastTransitionTime":"2025-10-01T12:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.570371 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.570433 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.570451 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.570476 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.570492 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:35Z","lastTransitionTime":"2025-10-01T12:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.672966 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.673036 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.673057 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.673089 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.673110 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:35Z","lastTransitionTime":"2025-10-01T12:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.777076 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.777119 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.777133 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.777150 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.777158 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:35Z","lastTransitionTime":"2025-10-01T12:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.879332 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.879390 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.879411 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.879442 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.879492 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:35Z","lastTransitionTime":"2025-10-01T12:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.982287 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.982347 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.982366 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.982389 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:35 crc kubenswrapper[4851]: I1001 12:54:35.982405 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:35Z","lastTransitionTime":"2025-10-01T12:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.085781 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.085846 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.085860 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.085880 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.085895 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:36Z","lastTransitionTime":"2025-10-01T12:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.189271 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.189336 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.189362 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.189391 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.189412 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:36Z","lastTransitionTime":"2025-10-01T12:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.270175 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:54:36 crc kubenswrapper[4851]: E1001 12:54:36.270422 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:40.270385065 +0000 UTC m=+148.615502591 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.270549 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.270630 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.270684 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:36 crc kubenswrapper[4851]: E1001 12:54:36.270801 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.270829 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:36 crc kubenswrapper[4851]: E1001 12:54:36.270838 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:54:36 crc kubenswrapper[4851]: E1001 12:54:36.270940 4851 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:54:36 crc kubenswrapper[4851]: E1001 12:54:36.270963 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:54:36 crc kubenswrapper[4851]: E1001 12:54:36.271004 4851 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:54:36 crc kubenswrapper[4851]: E1001 12:54:36.271026 4851 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:54:36 crc kubenswrapper[4851]: E1001 12:54:36.271105 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:55:40.271030093 +0000 UTC m=+148.616147639 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:54:36 crc kubenswrapper[4851]: E1001 12:54:36.270950 4851 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:54:36 crc kubenswrapper[4851]: E1001 12:54:36.271147 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:55:40.271128006 +0000 UTC m=+148.616245532 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:54:36 crc kubenswrapper[4851]: E1001 12:54:36.270834 4851 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:54:36 crc kubenswrapper[4851]: E1001 12:54:36.271180 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:55:40.271164037 +0000 UTC m=+148.616281653 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:54:36 crc kubenswrapper[4851]: E1001 12:54:36.271221 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-01 12:55:40.271204008 +0000 UTC m=+148.616321614 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.292280 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.292328 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.292344 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.292366 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.292383 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:36Z","lastTransitionTime":"2025-10-01T12:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.327946 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:36 crc kubenswrapper[4851]: E1001 12:54:36.328176 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.395989 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.396051 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.396068 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.396092 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.396109 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:36Z","lastTransitionTime":"2025-10-01T12:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.499641 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.499706 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.499729 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.499760 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:54:36 crc kubenswrapper[4851]: I1001 12:54:36.499780 4851 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:54:36Z","lastTransitionTime":"2025-10-01T12:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[38 further identical node-status cycles ("Recording event message for node" x4 + "Node became not ready"), 12:54:36.602 through 12:54:40.349, elided]
Oct 01 12:54:37 crc kubenswrapper[4851]: I1001 12:54:37.327319 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 12:54:37 crc kubenswrapper[4851]: I1001 12:54:37.327428 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 12:54:37 crc kubenswrapper[4851]: I1001 12:54:37.327323 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 12:54:37 crc kubenswrapper[4851]: E1001 12:54:37.327475 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 12:54:37 crc kubenswrapper[4851]: E1001 12:54:37.327637 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 12:54:37 crc kubenswrapper[4851]: E1001 12:54:37.327791 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 12:54:38 crc kubenswrapper[4851]: I1001 12:54:38.328013 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp"
Oct 01 12:54:38 crc kubenswrapper[4851]: E1001 12:54:38.328253 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962"
Oct 01 12:54:39 crc kubenswrapper[4851]: I1001 12:54:39.328155 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 12:54:39 crc kubenswrapper[4851]: I1001 12:54:39.328261 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 12:54:39 crc kubenswrapper[4851]: I1001 12:54:39.328334 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 12:54:39 crc kubenswrapper[4851]: E1001 12:54:39.328352 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 12:54:39 crc kubenswrapper[4851]: E1001 12:54:39.328467 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 12:54:39 crc kubenswrapper[4851]: E1001 12:54:39.328957 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.327431 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp"
Oct 01 12:54:40 crc kubenswrapper[4851]: E1001 12:54:40.327626 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.428770 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-nn4np"]
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.429363 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nn4np"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.433164 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.433410 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.433574 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.433820 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.481132 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=41.481105932 podStartE2EDuration="41.481105932s" podCreationTimestamp="2025-10-01 12:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:40.456847916 +0000 UTC m=+88.801965432" watchObservedRunningTime="2025-10-01 12:54:40.481105932 +0000 UTC m=+88.826223458"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.481397 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=67.481387739 podStartE2EDuration="1m7.481387739s" podCreationTimestamp="2025-10-01 12:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:40.481102691 +0000 UTC m=+88.826220187" watchObservedRunningTime="2025-10-01 12:54:40.481387739 +0000 UTC m=+88.826505255"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.518345 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/edd23913-1762-4ff2-bd3a-683198d55e2c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-nn4np\" (UID: \"edd23913-1762-4ff2-bd3a-683198d55e2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nn4np"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.518471 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/edd23913-1762-4ff2-bd3a-683198d55e2c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-nn4np\" (UID: \"edd23913-1762-4ff2-bd3a-683198d55e2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nn4np"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.518545 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edd23913-1762-4ff2-bd3a-683198d55e2c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-nn4np\" (UID: \"edd23913-1762-4ff2-bd3a-683198d55e2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nn4np"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.518631 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/edd23913-1762-4ff2-bd3a-683198d55e2c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-nn4np\" (UID: \"edd23913-1762-4ff2-bd3a-683198d55e2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nn4np"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.518746 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/edd23913-1762-4ff2-bd3a-683198d55e2c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-nn4np\" (UID: \"edd23913-1762-4ff2-bd3a-683198d55e2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nn4np"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.541028 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rc8gl" podStartSLOduration=67.540995943 podStartE2EDuration="1m7.540995943s" podCreationTimestamp="2025-10-01 12:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:40.540545401 +0000 UTC m=+88.885662897" watchObservedRunningTime="2025-10-01 12:54:40.540995943 +0000 UTC m=+88.886113459"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.596791 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=20.596772799 podStartE2EDuration="20.596772799s" podCreationTimestamp="2025-10-01 12:54:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:40.575441386 +0000 UTC m=+88.920558882" watchObservedRunningTime="2025-10-01 12:54:40.596772799 +0000 UTC m=+88.941890295"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.619857 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/edd23913-1762-4ff2-bd3a-683198d55e2c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-nn4np\" (UID: \"edd23913-1762-4ff2-bd3a-683198d55e2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nn4np"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.619942 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/edd23913-1762-4ff2-bd3a-683198d55e2c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-nn4np\" (UID: \"edd23913-1762-4ff2-bd3a-683198d55e2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nn4np"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.620017 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/edd23913-1762-4ff2-bd3a-683198d55e2c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-nn4np\" (UID: \"edd23913-1762-4ff2-bd3a-683198d55e2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nn4np"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.620056 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edd23913-1762-4ff2-bd3a-683198d55e2c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-nn4np\" (UID: \"edd23913-1762-4ff2-bd3a-683198d55e2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nn4np"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.620097 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/edd23913-1762-4ff2-bd3a-683198d55e2c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-nn4np\" (UID: \"edd23913-1762-4ff2-bd3a-683198d55e2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nn4np"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.620190 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/edd23913-1762-4ff2-bd3a-683198d55e2c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-nn4np\" (UID: \"edd23913-1762-4ff2-bd3a-683198d55e2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nn4np"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.620244 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/edd23913-1762-4ff2-bd3a-683198d55e2c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-nn4np\" (UID: \"edd23913-1762-4ff2-bd3a-683198d55e2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nn4np"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.621005 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/edd23913-1762-4ff2-bd3a-683198d55e2c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-nn4np\" (UID: \"edd23913-1762-4ff2-bd3a-683198d55e2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nn4np"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.626547 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edd23913-1762-4ff2-bd3a-683198d55e2c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-nn4np\" (UID: \"edd23913-1762-4ff2-bd3a-683198d55e2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nn4np"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.651098 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/edd23913-1762-4ff2-bd3a-683198d55e2c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-nn4np\" (UID: \"edd23913-1762-4ff2-bd3a-683198d55e2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nn4np"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.658795 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5q84l" podStartSLOduration=67.658762 podStartE2EDuration="1m7.658762s" podCreationTimestamp="2025-10-01 12:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:40.638334713 +0000 UTC m=+88.983452269" watchObservedRunningTime="2025-10-01 12:54:40.658762 +0000 UTC m=+89.003879536"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.659790 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-brwh5" podStartSLOduration=66.659775799 podStartE2EDuration="1m6.659775799s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:40.658025109 +0000 UTC m=+89.003142615" watchObservedRunningTime="2025-10-01 12:54:40.659775799 +0000 UTC m=+89.004893325"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.675067 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podStartSLOduration=67.67504363 podStartE2EDuration="1m7.67504363s" podCreationTimestamp="2025-10-01 12:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:40.673561438 +0000 UTC m=+89.018679004" watchObservedRunningTime="2025-10-01 12:54:40.67504363 +0000 UTC m=+89.020161166"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.722236 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=66.722219403 podStartE2EDuration="1m6.722219403s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:40.693670596 +0000 UTC m=+89.038788182" watchObservedRunningTime="2025-10-01 12:54:40.722219403 +0000 UTC m=+89.067336889"
Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.722448 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=10.722434359 podStartE2EDuration="10.722434359s" podCreationTimestamp="2025-10-01 12:54:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:40.722438429 +0000 UTC m=+89.067555955" watchObservedRunningTime="2025-10-01 12:54:40.722434359 +0000 UTC m=+89.067551845"
12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.750986 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nn4np" Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.829736 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-t5vvf" podStartSLOduration=66.829710229 podStartE2EDuration="1m6.829710229s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:40.809960301 +0000 UTC m=+89.155077867" watchObservedRunningTime="2025-10-01 12:54:40.829710229 +0000 UTC m=+89.174827745" Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.860494 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7qmx" podStartSLOduration=66.860471348 podStartE2EDuration="1m6.860471348s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:40.860062196 +0000 UTC m=+89.205179682" watchObservedRunningTime="2025-10-01 12:54:40.860471348 +0000 UTC m=+89.205588844" Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.958038 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nn4np" event={"ID":"edd23913-1762-4ff2-bd3a-683198d55e2c","Type":"ContainerStarted","Data":"b088566c2bca3c9dfa0747658df593b6700d0f6688b9b42747ac946237a407f3"} Oct 01 12:54:40 crc kubenswrapper[4851]: I1001 12:54:40.958103 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nn4np" event={"ID":"edd23913-1762-4ff2-bd3a-683198d55e2c","Type":"ContainerStarted","Data":"bc00fc2d87def09bbd5267b98b1befc6efb930ddfaf68de4565101c862533d06"} Oct 01 12:54:41 crc kubenswrapper[4851]: I1001 12:54:41.327776 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:41 crc kubenswrapper[4851]: I1001 12:54:41.327857 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:41 crc kubenswrapper[4851]: I1001 12:54:41.327896 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:41 crc kubenswrapper[4851]: E1001 12:54:41.328052 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:41 crc kubenswrapper[4851]: E1001 12:54:41.328150 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:54:41 crc kubenswrapper[4851]: E1001 12:54:41.328311 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:42 crc kubenswrapper[4851]: I1001 12:54:42.328027 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:42 crc kubenswrapper[4851]: E1001 12:54:42.329114 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:43 crc kubenswrapper[4851]: I1001 12:54:43.327484 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:43 crc kubenswrapper[4851]: E1001 12:54:43.327619 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:43 crc kubenswrapper[4851]: I1001 12:54:43.327671 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:43 crc kubenswrapper[4851]: E1001 12:54:43.327818 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:43 crc kubenswrapper[4851]: I1001 12:54:43.327680 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:43 crc kubenswrapper[4851]: E1001 12:54:43.328212 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:54:44 crc kubenswrapper[4851]: I1001 12:54:44.327700 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:44 crc kubenswrapper[4851]: E1001 12:54:44.327872 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:45 crc kubenswrapper[4851]: I1001 12:54:45.328316 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:45 crc kubenswrapper[4851]: I1001 12:54:45.328445 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:45 crc kubenswrapper[4851]: I1001 12:54:45.328654 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:45 crc kubenswrapper[4851]: E1001 12:54:45.328860 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:45 crc kubenswrapper[4851]: E1001 12:54:45.328951 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:45 crc kubenswrapper[4851]: E1001 12:54:45.329129 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:54:46 crc kubenswrapper[4851]: I1001 12:54:46.327989 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:46 crc kubenswrapper[4851]: E1001 12:54:46.328179 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:47 crc kubenswrapper[4851]: I1001 12:54:47.327842 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:47 crc kubenswrapper[4851]: I1001 12:54:47.327866 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:47 crc kubenswrapper[4851]: E1001 12:54:47.328040 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:47 crc kubenswrapper[4851]: I1001 12:54:47.327874 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:47 crc kubenswrapper[4851]: E1001 12:54:47.328173 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:54:47 crc kubenswrapper[4851]: E1001 12:54:47.328370 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:48 crc kubenswrapper[4851]: I1001 12:54:48.329290 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:48 crc kubenswrapper[4851]: I1001 12:54:48.329879 4851 scope.go:117] "RemoveContainer" containerID="a1b6af2f6ac4497d8c14ffd58bab6752f79d0bda96bd501432860664e60954d4" Oct 01 12:54:48 crc kubenswrapper[4851]: E1001 12:54:48.329933 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:48 crc kubenswrapper[4851]: E1001 12:54:48.330134 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-s78wn_openshift-ovn-kubernetes(eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" Oct 01 12:54:49 crc kubenswrapper[4851]: I1001 12:54:49.327953 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:49 crc kubenswrapper[4851]: I1001 12:54:49.327963 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:49 crc kubenswrapper[4851]: E1001 12:54:49.328183 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:54:49 crc kubenswrapper[4851]: E1001 12:54:49.328316 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:49 crc kubenswrapper[4851]: I1001 12:54:49.327987 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:49 crc kubenswrapper[4851]: E1001 12:54:49.328634 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:50 crc kubenswrapper[4851]: I1001 12:54:50.327972 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:50 crc kubenswrapper[4851]: E1001 12:54:50.328313 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:51 crc kubenswrapper[4851]: I1001 12:54:51.328424 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:51 crc kubenswrapper[4851]: I1001 12:54:51.328521 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:51 crc kubenswrapper[4851]: I1001 12:54:51.328424 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:51 crc kubenswrapper[4851]: E1001 12:54:51.328674 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:51 crc kubenswrapper[4851]: E1001 12:54:51.328753 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:51 crc kubenswrapper[4851]: E1001 12:54:51.328899 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:54:52 crc kubenswrapper[4851]: I1001 12:54:52.327683 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:52 crc kubenswrapper[4851]: E1001 12:54:52.329592 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:52 crc kubenswrapper[4851]: I1001 12:54:52.657074 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs\") pod \"network-metrics-daemon-75dqp\" (UID: \"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\") " pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:52 crc kubenswrapper[4851]: E1001 12:54:52.657301 4851 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:54:52 crc kubenswrapper[4851]: E1001 12:54:52.657411 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs podName:8a8fe88f-cdfe-4415-98a0-4cc8f018a962 nodeName:}" failed. No retries permitted until 2025-10-01 12:55:56.657378463 +0000 UTC m=+165.002495989 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs") pod "network-metrics-daemon-75dqp" (UID: "8a8fe88f-cdfe-4415-98a0-4cc8f018a962") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:54:53 crc kubenswrapper[4851]: I1001 12:54:53.327892 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:53 crc kubenswrapper[4851]: I1001 12:54:53.327892 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:53 crc kubenswrapper[4851]: E1001 12:54:53.328070 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:53 crc kubenswrapper[4851]: I1001 12:54:53.327922 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:53 crc kubenswrapper[4851]: E1001 12:54:53.328253 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:53 crc kubenswrapper[4851]: E1001 12:54:53.328431 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:54:54 crc kubenswrapper[4851]: I1001 12:54:54.328487 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:54 crc kubenswrapper[4851]: E1001 12:54:54.328763 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:55 crc kubenswrapper[4851]: I1001 12:54:55.328413 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:55 crc kubenswrapper[4851]: I1001 12:54:55.328420 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:55 crc kubenswrapper[4851]: I1001 12:54:55.328450 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:55 crc kubenswrapper[4851]: E1001 12:54:55.328669 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:54:55 crc kubenswrapper[4851]: E1001 12:54:55.328795 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:55 crc kubenswrapper[4851]: E1001 12:54:55.328926 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:56 crc kubenswrapper[4851]: I1001 12:54:56.328725 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:56 crc kubenswrapper[4851]: E1001 12:54:56.328899 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:57 crc kubenswrapper[4851]: I1001 12:54:57.327800 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:57 crc kubenswrapper[4851]: I1001 12:54:57.327837 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:57 crc kubenswrapper[4851]: I1001 12:54:57.327828 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:57 crc kubenswrapper[4851]: E1001 12:54:57.328009 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:57 crc kubenswrapper[4851]: E1001 12:54:57.328157 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:57 crc kubenswrapper[4851]: E1001 12:54:57.328247 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:54:58 crc kubenswrapper[4851]: I1001 12:54:58.328039 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:54:58 crc kubenswrapper[4851]: E1001 12:54:58.328194 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:54:59 crc kubenswrapper[4851]: I1001 12:54:59.327728 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:54:59 crc kubenswrapper[4851]: I1001 12:54:59.327757 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:54:59 crc kubenswrapper[4851]: I1001 12:54:59.327774 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:54:59 crc kubenswrapper[4851]: E1001 12:54:59.327886 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:54:59 crc kubenswrapper[4851]: E1001 12:54:59.328079 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:54:59 crc kubenswrapper[4851]: E1001 12:54:59.328199 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:55:00 crc kubenswrapper[4851]: I1001 12:55:00.328391 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:55:00 crc kubenswrapper[4851]: E1001 12:55:00.328648 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:55:01 crc kubenswrapper[4851]: I1001 12:55:01.328255 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:55:01 crc kubenswrapper[4851]: I1001 12:55:01.328373 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:55:01 crc kubenswrapper[4851]: I1001 12:55:01.328380 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:55:01 crc kubenswrapper[4851]: E1001 12:55:01.328926 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:55:01 crc kubenswrapper[4851]: E1001 12:55:01.329025 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:55:01 crc kubenswrapper[4851]: E1001 12:55:01.329138 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:55:01 crc kubenswrapper[4851]: I1001 12:55:01.329535 4851 scope.go:117] "RemoveContainer" containerID="a1b6af2f6ac4497d8c14ffd58bab6752f79d0bda96bd501432860664e60954d4" Oct 01 12:55:01 crc kubenswrapper[4851]: E1001 12:55:01.329779 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-s78wn_openshift-ovn-kubernetes(eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" Oct 01 12:55:02 crc kubenswrapper[4851]: I1001 12:55:02.327807 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:55:02 crc kubenswrapper[4851]: E1001 12:55:02.330227 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:55:03 crc kubenswrapper[4851]: I1001 12:55:03.328282 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:55:03 crc kubenswrapper[4851]: I1001 12:55:03.328357 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:55:03 crc kubenswrapper[4851]: I1001 12:55:03.328387 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:55:03 crc kubenswrapper[4851]: E1001 12:55:03.328470 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:55:03 crc kubenswrapper[4851]: E1001 12:55:03.328697 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:55:03 crc kubenswrapper[4851]: E1001 12:55:03.329000 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:55:04 crc kubenswrapper[4851]: I1001 12:55:04.328051 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:55:04 crc kubenswrapper[4851]: E1001 12:55:04.328344 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:55:05 crc kubenswrapper[4851]: I1001 12:55:05.327320 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:55:05 crc kubenswrapper[4851]: I1001 12:55:05.327319 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:55:05 crc kubenswrapper[4851]: I1001 12:55:05.327408 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:55:05 crc kubenswrapper[4851]: E1001 12:55:05.327484 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:55:05 crc kubenswrapper[4851]: E1001 12:55:05.327619 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:55:05 crc kubenswrapper[4851]: E1001 12:55:05.327791 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:55:06 crc kubenswrapper[4851]: I1001 12:55:06.327622 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:55:06 crc kubenswrapper[4851]: E1001 12:55:06.327852 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:55:07 crc kubenswrapper[4851]: I1001 12:55:07.328023 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:55:07 crc kubenswrapper[4851]: I1001 12:55:07.328141 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:55:07 crc kubenswrapper[4851]: E1001 12:55:07.328234 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:55:07 crc kubenswrapper[4851]: E1001 12:55:07.328375 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:55:07 crc kubenswrapper[4851]: I1001 12:55:07.328756 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:55:07 crc kubenswrapper[4851]: E1001 12:55:07.328904 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:55:08 crc kubenswrapper[4851]: I1001 12:55:08.328197 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:55:08 crc kubenswrapper[4851]: E1001 12:55:08.328450 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:55:09 crc kubenswrapper[4851]: I1001 12:55:09.066788 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t5vvf_f68f162a-4e04-41d2-8197-95bac24aad23/kube-multus/1.log" Oct 01 12:55:09 crc kubenswrapper[4851]: I1001 12:55:09.067912 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t5vvf_f68f162a-4e04-41d2-8197-95bac24aad23/kube-multus/0.log" Oct 01 12:55:09 crc kubenswrapper[4851]: I1001 12:55:09.068013 4851 generic.go:334] "Generic (PLEG): container finished" podID="f68f162a-4e04-41d2-8197-95bac24aad23" containerID="9be4185c6a9a5112475d68907478d108366f2f6bff109e7d802e8b720953ffd9" exitCode=1 Oct 01 12:55:09 crc kubenswrapper[4851]: I1001 12:55:09.068083 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t5vvf" event={"ID":"f68f162a-4e04-41d2-8197-95bac24aad23","Type":"ContainerDied","Data":"9be4185c6a9a5112475d68907478d108366f2f6bff109e7d802e8b720953ffd9"} Oct 01 12:55:09 crc kubenswrapper[4851]: I1001 12:55:09.068137 4851 scope.go:117] "RemoveContainer" containerID="929c2be2516002275dd31022ebe717b9a479bca89cc1ee09f9666d06a1071965" Oct 01 12:55:09 crc kubenswrapper[4851]: I1001 12:55:09.068741 4851 scope.go:117] "RemoveContainer" containerID="9be4185c6a9a5112475d68907478d108366f2f6bff109e7d802e8b720953ffd9" Oct 01 12:55:09 crc kubenswrapper[4851]: E1001 12:55:09.069018 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-t5vvf_openshift-multus(f68f162a-4e04-41d2-8197-95bac24aad23)\"" pod="openshift-multus/multus-t5vvf" podUID="f68f162a-4e04-41d2-8197-95bac24aad23" Oct 01 12:55:09 crc kubenswrapper[4851]: I1001 12:55:09.102390 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nn4np" podStartSLOduration=96.10236102 podStartE2EDuration="1m36.10236102s" podCreationTimestamp="2025-10-01 12:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:40.977672069 +0000 UTC m=+89.322789635" watchObservedRunningTime="2025-10-01 12:55:09.10236102 +0000 UTC m=+117.447478546" Oct 01 12:55:09 crc kubenswrapper[4851]: I1001 12:55:09.327561 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:55:09 crc kubenswrapper[4851]: I1001 12:55:09.327630 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:55:09 crc kubenswrapper[4851]: I1001 12:55:09.327574 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:55:09 crc kubenswrapper[4851]: E1001 12:55:09.327738 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:55:09 crc kubenswrapper[4851]: E1001 12:55:09.327932 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:55:09 crc kubenswrapper[4851]: E1001 12:55:09.328121 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:55:10 crc kubenswrapper[4851]: I1001 12:55:10.076353 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t5vvf_f68f162a-4e04-41d2-8197-95bac24aad23/kube-multus/1.log" Oct 01 12:55:10 crc kubenswrapper[4851]: I1001 12:55:10.328490 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:55:10 crc kubenswrapper[4851]: E1001 12:55:10.328747 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:55:11 crc kubenswrapper[4851]: I1001 12:55:11.328107 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:55:11 crc kubenswrapper[4851]: I1001 12:55:11.328195 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:55:11 crc kubenswrapper[4851]: I1001 12:55:11.328203 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:55:11 crc kubenswrapper[4851]: E1001 12:55:11.328340 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:55:11 crc kubenswrapper[4851]: E1001 12:55:11.328533 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:55:11 crc kubenswrapper[4851]: E1001 12:55:11.328680 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:55:12 crc kubenswrapper[4851]: E1001 12:55:12.322553 4851 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 01 12:55:12 crc kubenswrapper[4851]: I1001 12:55:12.327419 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:55:12 crc kubenswrapper[4851]: E1001 12:55:12.329581 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:55:12 crc kubenswrapper[4851]: I1001 12:55:12.330073 4851 scope.go:117] "RemoveContainer" containerID="a1b6af2f6ac4497d8c14ffd58bab6752f79d0bda96bd501432860664e60954d4" Oct 01 12:55:12 crc kubenswrapper[4851]: E1001 12:55:12.437427 4851 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 12:55:13 crc kubenswrapper[4851]: I1001 12:55:13.089469 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s78wn_eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9/ovnkube-controller/3.log" Oct 01 12:55:13 crc kubenswrapper[4851]: I1001 12:55:13.092221 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerStarted","Data":"726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5"} Oct 01 12:55:13 crc kubenswrapper[4851]: I1001 12:55:13.092649 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:55:13 crc kubenswrapper[4851]: I1001 12:55:13.132850 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" podStartSLOduration=99.132833075 podStartE2EDuration="1m39.132833075s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:13.130369895 +0000 UTC m=+121.475487421" watchObservedRunningTime="2025-10-01 12:55:13.132833075 +0000 UTC m=+121.477950581" Oct 01 12:55:13 crc kubenswrapper[4851]: I1001 12:55:13.327659 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:55:13 crc kubenswrapper[4851]: I1001 12:55:13.327715 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:55:13 crc kubenswrapper[4851]: I1001 12:55:13.327785 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:55:13 crc kubenswrapper[4851]: E1001 12:55:13.327889 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:55:13 crc kubenswrapper[4851]: E1001 12:55:13.327998 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:55:13 crc kubenswrapper[4851]: E1001 12:55:13.328075 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:55:13 crc kubenswrapper[4851]: I1001 12:55:13.433598 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-75dqp"] Oct 01 12:55:13 crc kubenswrapper[4851]: I1001 12:55:13.433731 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:55:13 crc kubenswrapper[4851]: E1001 12:55:13.433834 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:55:15 crc kubenswrapper[4851]: I1001 12:55:15.327667 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:55:15 crc kubenswrapper[4851]: E1001 12:55:15.328027 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:55:15 crc kubenswrapper[4851]: I1001 12:55:15.327710 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:55:15 crc kubenswrapper[4851]: E1001 12:55:15.328098 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:55:15 crc kubenswrapper[4851]: I1001 12:55:15.327766 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:55:15 crc kubenswrapper[4851]: E1001 12:55:15.328156 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:55:15 crc kubenswrapper[4851]: I1001 12:55:15.327669 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:55:15 crc kubenswrapper[4851]: E1001 12:55:15.328216 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:55:17 crc kubenswrapper[4851]: I1001 12:55:17.327976 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:55:17 crc kubenswrapper[4851]: I1001 12:55:17.328048 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:55:17 crc kubenswrapper[4851]: I1001 12:55:17.328029 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:55:17 crc kubenswrapper[4851]: I1001 12:55:17.327997 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:55:17 crc kubenswrapper[4851]: E1001 12:55:17.328175 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:55:17 crc kubenswrapper[4851]: E1001 12:55:17.328311 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:55:17 crc kubenswrapper[4851]: E1001 12:55:17.328445 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:55:17 crc kubenswrapper[4851]: E1001 12:55:17.328568 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:55:17 crc kubenswrapper[4851]: E1001 12:55:17.439202 4851 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 12:55:19 crc kubenswrapper[4851]: I1001 12:55:19.327387 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:55:19 crc kubenswrapper[4851]: I1001 12:55:19.327428 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:55:19 crc kubenswrapper[4851]: I1001 12:55:19.327547 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:55:19 crc kubenswrapper[4851]: I1001 12:55:19.327780 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:55:19 crc kubenswrapper[4851]: E1001 12:55:19.328089 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:55:19 crc kubenswrapper[4851]: E1001 12:55:19.328257 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:55:19 crc kubenswrapper[4851]: E1001 12:55:19.328432 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:55:19 crc kubenswrapper[4851]: E1001 12:55:19.328594 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:55:21 crc kubenswrapper[4851]: I1001 12:55:21.328142 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:55:21 crc kubenswrapper[4851]: I1001 12:55:21.328183 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:55:21 crc kubenswrapper[4851]: I1001 12:55:21.328201 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:55:21 crc kubenswrapper[4851]: I1001 12:55:21.328149 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:55:21 crc kubenswrapper[4851]: E1001 12:55:21.328355 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:55:21 crc kubenswrapper[4851]: E1001 12:55:21.328438 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:55:21 crc kubenswrapper[4851]: E1001 12:55:21.328593 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:55:21 crc kubenswrapper[4851]: E1001 12:55:21.328765 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:55:22 crc kubenswrapper[4851]: E1001 12:55:22.441161 4851 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 12:55:23 crc kubenswrapper[4851]: I1001 12:55:23.327990 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:55:23 crc kubenswrapper[4851]: I1001 12:55:23.328006 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:55:23 crc kubenswrapper[4851]: I1001 12:55:23.328218 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:55:23 crc kubenswrapper[4851]: E1001 12:55:23.328276 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:55:23 crc kubenswrapper[4851]: I1001 12:55:23.328353 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:55:23 crc kubenswrapper[4851]: E1001 12:55:23.328431 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:55:23 crc kubenswrapper[4851]: E1001 12:55:23.328496 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:55:23 crc kubenswrapper[4851]: E1001 12:55:23.328653 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:55:24 crc kubenswrapper[4851]: I1001 12:55:24.328570 4851 scope.go:117] "RemoveContainer" containerID="9be4185c6a9a5112475d68907478d108366f2f6bff109e7d802e8b720953ffd9" Oct 01 12:55:25 crc kubenswrapper[4851]: I1001 12:55:25.138742 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t5vvf_f68f162a-4e04-41d2-8197-95bac24aad23/kube-multus/1.log" Oct 01 12:55:25 crc kubenswrapper[4851]: I1001 12:55:25.139139 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t5vvf" event={"ID":"f68f162a-4e04-41d2-8197-95bac24aad23","Type":"ContainerStarted","Data":"67fb38d621be1a818ac5d4d426ab4652a41b195832d2e760e8388c776796f58c"} Oct 01 12:55:25 crc kubenswrapper[4851]: I1001 12:55:25.328329 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:55:25 crc kubenswrapper[4851]: I1001 12:55:25.328461 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:55:25 crc kubenswrapper[4851]: E1001 12:55:25.328563 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:55:25 crc kubenswrapper[4851]: I1001 12:55:25.328640 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:55:25 crc kubenswrapper[4851]: E1001 12:55:25.328859 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:55:25 crc kubenswrapper[4851]: I1001 12:55:25.328978 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:55:25 crc kubenswrapper[4851]: E1001 12:55:25.329173 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:55:25 crc kubenswrapper[4851]: E1001 12:55:25.329355 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:55:25 crc kubenswrapper[4851]: I1001 12:55:25.424495 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 12:55:27 crc kubenswrapper[4851]: I1001 12:55:27.328049 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:55:27 crc kubenswrapper[4851]: E1001 12:55:27.328233 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75dqp" podUID="8a8fe88f-cdfe-4415-98a0-4cc8f018a962" Oct 01 12:55:27 crc kubenswrapper[4851]: I1001 12:55:27.328490 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:55:27 crc kubenswrapper[4851]: E1001 12:55:27.328605 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:55:27 crc kubenswrapper[4851]: I1001 12:55:27.328788 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:55:27 crc kubenswrapper[4851]: E1001 12:55:27.328869 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:55:27 crc kubenswrapper[4851]: I1001 12:55:27.329038 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:55:27 crc kubenswrapper[4851]: E1001 12:55:27.329115 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:55:29 crc kubenswrapper[4851]: I1001 12:55:29.328144 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:55:29 crc kubenswrapper[4851]: I1001 12:55:29.328256 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:55:29 crc kubenswrapper[4851]: I1001 12:55:29.328184 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:55:29 crc kubenswrapper[4851]: I1001 12:55:29.328453 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:55:29 crc kubenswrapper[4851]: I1001 12:55:29.330883 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 01 12:55:29 crc kubenswrapper[4851]: I1001 12:55:29.331706 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 01 12:55:29 crc kubenswrapper[4851]: I1001 12:55:29.331750 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 01 12:55:29 crc kubenswrapper[4851]: I1001 12:55:29.332371 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 01 12:55:29 crc kubenswrapper[4851]: I1001 12:55:29.332894 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 01 12:55:29 crc kubenswrapper[4851]: I1001 12:55:29.333892 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.445344 4851 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.499949 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-v2v8j"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.501113 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.501296 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.502230 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k7rvb"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.502298 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.502988 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.506959 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2qlz9"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.507846 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2qlz9" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.516027 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6z8fq"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.516695 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.517201 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.524451 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.541521 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.541713 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.541768 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.541833 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.541926 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.542092 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.542573 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.551756 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.551899 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.552013 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.552164 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.552272 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.552422 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.552812 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.553243 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.553990 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.554157 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.554400 4851 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.554514 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.554620 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.554668 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.554747 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.554823 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.556130 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.557463 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.557601 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-xjvqh"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.557977 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.558037 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.557987 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.558391 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.558730 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.558857 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.559032 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.561566 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.562640 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.563026 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.563249 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.563416 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.564032 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-68zzg"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.563936 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.564595 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-68zzg" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.567532 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tbg25"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.570284 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.577903 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.577918 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.579398 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.596027 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-x5dht"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.596932 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-qcsbq"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.597417 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vxwl6"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.597947 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-b7tcr"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.598842 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qzxpg"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.601820 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tbg25" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.602144 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.602247 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.601829 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x5dht" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.602563 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.602596 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.607646 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608190 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6538b69-30fc-4cc0-80da-62537b61f41f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-k7rvb\" (UID: \"f6538b69-30fc-4cc0-80da-62537b61f41f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608219 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608240 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/327992e6-37f8-487b-915b-b633070aece3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608260 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c557af36-f061-43db-9a5b-e88e8df92493-node-pullsecrets\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608277 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e0769b5-2ba5-4464-b580-ec55796e5ea1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-2qlz9\" (UID: \"2e0769b5-2ba5-4464-b580-ec55796e5ea1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2qlz9" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608294 4851 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/327992e6-37f8-487b-915b-b633070aece3-audit-policies\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608309 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsfvp\" (UniqueName: \"kubernetes.io/projected/327992e6-37f8-487b-915b-b633070aece3-kube-api-access-wsfvp\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608323 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c557af36-f061-43db-9a5b-e88e8df92493-audit-dir\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608345 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608359 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-console-config\") pod \"console-f9d7485db-xjvqh\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608378 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c557af36-f061-43db-9a5b-e88e8df92493-etcd-client\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608393 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c557af36-f061-43db-9a5b-e88e8df92493-etcd-serving-ca\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608409 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608424 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608440 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-service-ca\") pod \"console-f9d7485db-xjvqh\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608455 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697-config\") pod \"machine-api-operator-5694c8668f-68zzg\" (UID: \"0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68zzg" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608470 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608484 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/327992e6-37f8-487b-915b-b633070aece3-audit-dir\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608511 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c557af36-f061-43db-9a5b-e88e8df92493-audit\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608527 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2e0769b5-2ba5-4464-b580-ec55796e5ea1-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-2qlz9\" (UID: \"2e0769b5-2ba5-4464-b580-ec55796e5ea1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2qlz9" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608541 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6538b69-30fc-4cc0-80da-62537b61f41f-client-ca\") pod \"controller-manager-879f6c89f-k7rvb\" (UID: \"f6538b69-30fc-4cc0-80da-62537b61f41f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608557 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608584 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c557af36-f061-43db-9a5b-e88e8df92493-config\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608599 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkkqh\" (UniqueName: \"kubernetes.io/projected/b5b5efe6-729a-431f-b8cd-67562ec18593-kube-api-access-xkkqh\") pod \"console-f9d7485db-xjvqh\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608613 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7trkq\" (UniqueName: \"kubernetes.io/projected/c557af36-f061-43db-9a5b-e88e8df92493-kube-api-access-7trkq\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608627 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5b5efe6-729a-431f-b8cd-67562ec18593-console-serving-cert\") pod \"console-f9d7485db-xjvqh\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608640 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-trusted-ca-bundle\") pod \"console-f9d7485db-xjvqh\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608657 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6538b69-30fc-4cc0-80da-62537b61f41f-serving-cert\") pod \"controller-manager-879f6c89f-k7rvb\" (UID: \"f6538b69-30fc-4cc0-80da-62537b61f41f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608671 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7fss\" (UniqueName: \"kubernetes.io/projected/f6538b69-30fc-4cc0-80da-62537b61f41f-kube-api-access-g7fss\") pod \"controller-manager-879f6c89f-k7rvb\" (UID: \"f6538b69-30fc-4cc0-80da-62537b61f41f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608686 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-audit-policies\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 
crc kubenswrapper[4851]: I1001 12:55:31.608702 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608717 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c557af36-f061-43db-9a5b-e88e8df92493-trusted-ca-bundle\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608731 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608746 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/327992e6-37f8-487b-915b-b633070aece3-encryption-config\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608760 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608776 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c557af36-f061-43db-9a5b-e88e8df92493-image-import-ca\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608792 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/327992e6-37f8-487b-915b-b633070aece3-etcd-client\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608814 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/327992e6-37f8-487b-915b-b633070aece3-serving-cert\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608829 4851 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c557af36-f061-43db-9a5b-e88e8df92493-encryption-config\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608846 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c557af36-f061-43db-9a5b-e88e8df92493-serving-cert\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608860 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-oauth-serving-cert\") pod \"console-f9d7485db-xjvqh\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608874 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608890 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e0769b5-2ba5-4464-b580-ec55796e5ea1-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-2qlz9\" (UID: \"2e0769b5-2ba5-4464-b580-ec55796e5ea1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2qlz9" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608906 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6538b69-30fc-4cc0-80da-62537b61f41f-config\") pod \"controller-manager-879f6c89f-k7rvb\" (UID: \"f6538b69-30fc-4cc0-80da-62537b61f41f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608921 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608936 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/327992e6-37f8-487b-915b-b633070aece3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608950 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4pkj\" 
(UniqueName: \"kubernetes.io/projected/0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697-kube-api-access-q4pkj\") pod \"machine-api-operator-5694c8668f-68zzg\" (UID: \"0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68zzg" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608966 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86146cb0-0a54-4309-8345-565e2da39442-audit-dir\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608980 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krwjm\" (UniqueName: \"kubernetes.io/projected/86146cb0-0a54-4309-8345-565e2da39442-kube-api-access-krwjm\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.608994 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697-images\") pod \"machine-api-operator-5694c8668f-68zzg\" (UID: \"0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68zzg" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.609009 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc5m4\" (UniqueName: \"kubernetes.io/projected/2e0769b5-2ba5-4464-b580-ec55796e5ea1-kube-api-access-lc5m4\") pod \"cluster-image-registry-operator-dc59b4c8b-2qlz9\" (UID: \"2e0769b5-2ba5-4464-b580-ec55796e5ea1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2qlz9" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.609024 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-68zzg\" (UID: \"0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68zzg" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.609038 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5b5efe6-729a-431f-b8cd-67562ec18593-console-oauth-config\") pod \"console-f9d7485db-xjvqh\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.609120 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.609856 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.610079 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b7tcr" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.610214 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.610266 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.610397 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.612306 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.612357 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qcsbq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.612587 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-vxwl6" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.613070 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-72dqh"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.613438 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bhp8c"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.613714 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.613916 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qzxpg" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.613940 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-qtlzx"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.613999 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.614038 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-72dqh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.614257 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bhp8c" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.614691 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-qtlzx" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.615562 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7wzb9"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.615787 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.615942 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.615995 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.616066 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.616091 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.616166 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7wzb9" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.616287 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.616819 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.623800 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mffkh"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.624296 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6p6zn"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.624653 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6s9sc"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.625068 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.625353 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mffkh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.625564 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6p6zn" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.626006 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jqn7q"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.626545 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c9zt4"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.626946 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c9zt4" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.627001 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jqn7q" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.628787 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-wmgkn"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.629282 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mvb7b"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.629706 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-mvb7b" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.629997 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.630227 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6ccbw"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.630913 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6ccbw" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.632470 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fclxw"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.632927 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fclxw" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.633400 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.634172 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.634572 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-nk8rb"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.635356 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nk8rb" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.637521 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8dv98"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.638293 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8dv98" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.639912 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.640117 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.640270 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.640462 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.641176 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.641697 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.641867 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.645669 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2kfkj"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.646495 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2kfkj" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.646878 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.649562 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-fv9xq"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.649701 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.650229 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fv9xq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.650709 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.651974 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.652172 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.652356 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.652577 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.652742 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.652895 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.657277 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.657563 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.664593 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-lmpr6"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.666031 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-b76sz"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.666551 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lmpr6" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.666572 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7hnt8"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.680655 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-b76sz" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.681470 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.683652 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.684883 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.688362 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dq7rl"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.689132 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-28f4n"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.689676 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-28f4n" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.689938 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.690048 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.690140 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dq7rl" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.693407 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.694024 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.697204 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.697591 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.698699 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.699275 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.699120 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.699588 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.701115 4851 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.703524 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.703913 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.704029 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.704176 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.704295 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.706834 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.706954 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.713108 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.713287 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.713298 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.714278 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.714342 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.715293 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.715723 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-t26f4"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.715729 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697-images\") pod \"machine-api-operator-5694c8668f-68zzg\" (UID: \"0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68zzg" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.715795 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86146cb0-0a54-4309-8345-565e2da39442-audit-dir\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.715820 4851 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krwjm\" (UniqueName: \"kubernetes.io/projected/86146cb0-0a54-4309-8345-565e2da39442-kube-api-access-krwjm\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.715842 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc5m4\" (UniqueName: \"kubernetes.io/projected/2e0769b5-2ba5-4464-b580-ec55796e5ea1-kube-api-access-lc5m4\") pod \"cluster-image-registry-operator-dc59b4c8b-2qlz9\" (UID: \"2e0769b5-2ba5-4464-b580-ec55796e5ea1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2qlz9" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.715865 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-68zzg\" (UID: \"0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68zzg" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.715888 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5b5efe6-729a-431f-b8cd-67562ec18593-console-oauth-config\") pod \"console-f9d7485db-xjvqh\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.715920 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6538b69-30fc-4cc0-80da-62537b61f41f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-k7rvb\" (UID: \"f6538b69-30fc-4cc0-80da-62537b61f41f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.715942 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.715970 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/327992e6-37f8-487b-915b-b633070aece3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.715998 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c557af36-f061-43db-9a5b-e88e8df92493-node-pullsecrets\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716028 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/2e0769b5-2ba5-4464-b580-ec55796e5ea1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-2qlz9\" (UID: \"2e0769b5-2ba5-4464-b580-ec55796e5ea1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2qlz9" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716063 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c557af36-f061-43db-9a5b-e88e8df92493-audit-dir\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716093 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/327992e6-37f8-487b-915b-b633070aece3-audit-policies\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716114 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsfvp\" (UniqueName: \"kubernetes.io/projected/327992e6-37f8-487b-915b-b633070aece3-kube-api-access-wsfvp\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716138 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716161 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-console-config\") pod \"console-f9d7485db-xjvqh\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716191 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c557af36-f061-43db-9a5b-e88e8df92493-etcd-client\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716212 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c557af36-f061-43db-9a5b-e88e8df92493-etcd-serving-ca\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716236 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" 
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716260 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716281 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-service-ca\") pod \"console-f9d7485db-xjvqh\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716308 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697-config\") pod \"machine-api-operator-5694c8668f-68zzg\" (UID: \"0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68zzg" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716343 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c557af36-f061-43db-9a5b-e88e8df92493-audit\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716367 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2e0769b5-2ba5-4464-b580-ec55796e5ea1-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-2qlz9\" (UID: \"2e0769b5-2ba5-4464-b580-ec55796e5ea1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2qlz9" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716390 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6538b69-30fc-4cc0-80da-62537b61f41f-client-ca\") pod \"controller-manager-879f6c89f-k7rvb\" (UID: \"f6538b69-30fc-4cc0-80da-62537b61f41f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716413 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716435 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/327992e6-37f8-487b-915b-b633070aece3-audit-dir\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716458 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716514 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bhp8c"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716518 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c557af36-f061-43db-9a5b-e88e8df92493-config\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716543 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkkqh\" (UniqueName: \"kubernetes.io/projected/b5b5efe6-729a-431f-b8cd-67562ec18593-kube-api-access-xkkqh\") pod \"console-f9d7485db-xjvqh\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716563 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-trusted-ca-bundle\") pod \"console-f9d7485db-xjvqh\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716587 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t26f4" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716586 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7trkq\" (UniqueName: \"kubernetes.io/projected/c557af36-f061-43db-9a5b-e88e8df92493-kube-api-access-7trkq\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716820 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5b5efe6-729a-431f-b8cd-67562ec18593-console-serving-cert\") pod \"console-f9d7485db-xjvqh\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716838 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7fss\" (UniqueName: \"kubernetes.io/projected/f6538b69-30fc-4cc0-80da-62537b61f41f-kube-api-access-g7fss\") pod \"controller-manager-879f6c89f-k7rvb\" (UID: \"f6538b69-30fc-4cc0-80da-62537b61f41f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716857 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-audit-policies\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716879 4851 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6538b69-30fc-4cc0-80da-62537b61f41f-serving-cert\") pod \"controller-manager-879f6c89f-k7rvb\" (UID: \"f6538b69-30fc-4cc0-80da-62537b61f41f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716894 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716912 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716929 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/327992e6-37f8-487b-915b-b633070aece3-encryption-config\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716944 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c557af36-f061-43db-9a5b-e88e8df92493-trusted-ca-bundle\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716960 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716976 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c557af36-f061-43db-9a5b-e88e8df92493-image-import-ca\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.716995 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/327992e6-37f8-487b-915b-b633070aece3-etcd-client\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.717010 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/327992e6-37f8-487b-915b-b633070aece3-serving-cert\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: 
\"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.717023 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c557af36-f061-43db-9a5b-e88e8df92493-encryption-config\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.717041 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c557af36-f061-43db-9a5b-e88e8df92493-serving-cert\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.717056 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-oauth-serving-cert\") pod \"console-f9d7485db-xjvqh\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.717072 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.717087 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e0769b5-2ba5-4464-b580-ec55796e5ea1-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-2qlz9\" (UID: \"2e0769b5-2ba5-4464-b580-ec55796e5ea1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2qlz9" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.717105 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6538b69-30fc-4cc0-80da-62537b61f41f-config\") pod \"controller-manager-879f6c89f-k7rvb\" (UID: \"f6538b69-30fc-4cc0-80da-62537b61f41f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.717123 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/327992e6-37f8-487b-915b-b633070aece3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.717138 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4pkj\" (UniqueName: \"kubernetes.io/projected/0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697-kube-api-access-q4pkj\") pod \"machine-api-operator-5694c8668f-68zzg\" (UID: \"0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68zzg" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.717156 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.717260 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-v2v8j"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.717743 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.720264 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k7rvb"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.720286 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-xjvqh"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.720296 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.720989 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697-images\") pod \"machine-api-operator-5694c8668f-68zzg\" (UID: \"0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68zzg" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.721288 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86146cb0-0a54-4309-8345-565e2da39442-audit-dir\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.721847 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-console-config\") pod \"console-f9d7485db-xjvqh\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.722054 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c557af36-f061-43db-9a5b-e88e8df92493-etcd-serving-ca\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.722451 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c557af36-f061-43db-9a5b-e88e8df92493-node-pullsecrets\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.724911 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jqn7q"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.724934 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vxwl6"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.725889 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6p6zn"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.726984 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6538b69-30fc-4cc0-80da-62537b61f41f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-k7rvb\" (UID: \"f6538b69-30fc-4cc0-80da-62537b61f41f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.726989 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.728486 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/327992e6-37f8-487b-915b-b633070aece3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.728726 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6538b69-30fc-4cc0-80da-62537b61f41f-config\") pod \"controller-manager-879f6c89f-k7rvb\" (UID: \"f6538b69-30fc-4cc0-80da-62537b61f41f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.728805 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-oauth-serving-cert\") pod \"console-f9d7485db-xjvqh\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.730359 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6538b69-30fc-4cc0-80da-62537b61f41f-client-ca\") pod \"controller-manager-879f6c89f-k7rvb\" (UID: \"f6538b69-30fc-4cc0-80da-62537b61f41f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.731033 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.731528 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-service-ca\") pod \"console-f9d7485db-xjvqh\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 
12:55:31.732081 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697-config\") pod \"machine-api-operator-5694c8668f-68zzg\" (UID: \"0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68zzg" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.732457 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c557af36-f061-43db-9a5b-e88e8df92493-audit\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.732931 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/327992e6-37f8-487b-915b-b633070aece3-audit-policies\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.733091 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5b5efe6-729a-431f-b8cd-67562ec18593-console-serving-cert\") pod \"console-f9d7485db-xjvqh\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.733515 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/327992e6-37f8-487b-915b-b633070aece3-serving-cert\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.733537 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-68zzg\" (UID: \"0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68zzg" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.734263 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5b5efe6-729a-431f-b8cd-67562ec18593-console-oauth-config\") pod \"console-f9d7485db-xjvqh\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.734270 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e0769b5-2ba5-4464-b580-ec55796e5ea1-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-2qlz9\" (UID: \"2e0769b5-2ba5-4464-b580-ec55796e5ea1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2qlz9" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.735057 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c557af36-f061-43db-9a5b-e88e8df92493-config\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 
12:55:31.735213 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c557af36-f061-43db-9a5b-e88e8df92493-audit-dir\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.735248 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/327992e6-37f8-487b-915b-b633070aece3-audit-dir\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.735878 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/327992e6-37f8-487b-915b-b633070aece3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.736043 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c557af36-f061-43db-9a5b-e88e8df92493-trusted-ca-bundle\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.736468 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-audit-policies\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.736805 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c557af36-f061-43db-9a5b-e88e8df92493-encryption-config\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.736969 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.737022 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.737195 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.737210 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c557af36-f061-43db-9a5b-e88e8df92493-serving-cert\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 
12:55:31.737603 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.737882 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.738403 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-trusted-ca-bundle\") pod \"console-f9d7485db-xjvqh\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.738772 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c557af36-f061-43db-9a5b-e88e8df92493-image-import-ca\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.739616 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.739816 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e0769b5-2ba5-4464-b580-ec55796e5ea1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-2qlz9\" (UID: \"2e0769b5-2ba5-4464-b580-ec55796e5ea1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2qlz9" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.740525 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2qlz9"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.745652 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tbg25"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.746077 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6538b69-30fc-4cc0-80da-62537b61f41f-serving-cert\") pod \"controller-manager-879f6c89f-k7rvb\" (UID: \"f6538b69-30fc-4cc0-80da-62537b61f41f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.747305 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c557af36-f061-43db-9a5b-e88e8df92493-etcd-client\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.747458 4851 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.747600 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.747974 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.748072 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.748209 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.748446 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.748568 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-x5dht"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.749122 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.749359 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/327992e6-37f8-487b-915b-b633070aece3-encryption-config\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.749633 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/327992e6-37f8-487b-915b-b633070aece3-etcd-client\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.755673 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6z8fq"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.755718 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c9zt4"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.758268 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-72dqh"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.758295 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mffkh"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.758308 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6ccbw"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.763038 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qzxpg"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.765728 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.767316 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mvb7b"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.767955 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.767964 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.768963 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t26f4"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.770600 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7wzb9"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.773816 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qcsbq"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.775261 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-nk8rb"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.776363 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.778116 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-68zzg"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.779346 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.780395 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fclxw"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.781447 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6s9sc"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.782409 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.782731 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dq7rl"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.784456 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-f6l2s"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.785074 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-f6l2s" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.786730 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-wmgkn"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.787824 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-28f4n"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.788826 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7hnt8"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.797229 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-lmpr6"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.799562 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8dv98"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.800613 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-fv9xq"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.801648 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-b76sz"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.802626 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2kfkj"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.803877 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.807152 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gjvsn"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.809279 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gjvsn"] Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.809432 4851 util.go:30] "No sandbox for pod can be found. 
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.810279 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ffhmh"]
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.810886 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ffhmh"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.815855 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ffhmh"]
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.817927 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/24e14587-7a28-41eb-9184-668739c10654-default-certificate\") pod \"router-default-5444994796-qtlzx\" (UID: \"24e14587-7a28-41eb-9184-668739c10654\") " pod="openshift-ingress/router-default-5444994796-qtlzx"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.817960 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d07660b5-fafb-43c6-89a0-eb5a359eb314-config\") pod \"console-operator-58897d9998-mvb7b\" (UID: \"d07660b5-fafb-43c6-89a0-eb5a359eb314\") " pod="openshift-console-operator/console-operator-58897d9998-mvb7b"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.817987 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-778g7\" (UniqueName: \"kubernetes.io/projected/24e14587-7a28-41eb-9184-668739c10654-kube-api-access-778g7\") pod \"router-default-5444994796-qtlzx\" (UID: \"24e14587-7a28-41eb-9184-668739c10654\") " pod="openshift-ingress/router-default-5444994796-qtlzx"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818004 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57l6k\" (UniqueName: \"kubernetes.io/projected/8002697d-726f-49e7-830c-c64083912932-kube-api-access-57l6k\") pod \"service-ca-9c57cc56f-8dv98\" (UID: \"8002697d-726f-49e7-830c-c64083912932\") " pod="openshift-service-ca/service-ca-9c57cc56f-8dv98"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818029 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3d024a0-38bb-4a46-8b02-3f380b51d144-config\") pod \"etcd-operator-b45778765-wmgkn\" (UID: \"f3d024a0-38bb-4a46-8b02-3f380b51d144\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818045 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/24e14587-7a28-41eb-9184-668739c10654-stats-auth\") pod \"router-default-5444994796-qtlzx\" (UID: \"24e14587-7a28-41eb-9184-668739c10654\") " pod="openshift-ingress/router-default-5444994796-qtlzx"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818061 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d07660b5-fafb-43c6-89a0-eb5a359eb314-trusted-ca\") pod \"console-operator-58897d9998-mvb7b\" (UID: \"d07660b5-fafb-43c6-89a0-eb5a359eb314\") " pod="openshift-console-operator/console-operator-58897d9998-mvb7b"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818163 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpq97\" (UniqueName: \"kubernetes.io/projected/f3d024a0-38bb-4a46-8b02-3f380b51d144-kube-api-access-hpq97\") pod \"etcd-operator-b45778765-wmgkn\" (UID: \"f3d024a0-38bb-4a46-8b02-3f380b51d144\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818198 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkzv4\" (UniqueName: \"kubernetes.io/projected/f541a9af-1ddc-4f57-b6e9-2b1c5baa9953-kube-api-access-dkzv4\") pod \"service-ca-operator-777779d784-lmpr6\" (UID: \"f541a9af-1ddc-4f57-b6e9-2b1c5baa9953\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lmpr6"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818221 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e967d0be-0502-4857-9be0-1a6560362c34-tmpfs\") pod \"packageserver-d55dfcdfc-cqwct\" (UID: \"e967d0be-0502-4857-9be0-1a6560362c34\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818240 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f3d024a0-38bb-4a46-8b02-3f380b51d144-etcd-service-ca\") pod \"etcd-operator-b45778765-wmgkn\" (UID: \"f3d024a0-38bb-4a46-8b02-3f380b51d144\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818255 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9318518d-b104-4804-86ed-0b53f1c92c44-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6p6zn\" (UID: \"9318518d-b104-4804-86ed-0b53f1c92c44\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6p6zn"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818277 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5648b4bb-f782-4788-b87a-b484fbd1f375-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c9zt4\" (UID: \"5648b4bb-f782-4788-b87a-b484fbd1f375\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c9zt4"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818293 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3d024a0-38bb-4a46-8b02-3f380b51d144-serving-cert\") pod \"etcd-operator-b45778765-wmgkn\" (UID: \"f3d024a0-38bb-4a46-8b02-3f380b51d144\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818324 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23bc52e7-0e50-454f-87ee-b82b608ee34a-config-volume\") pod \"collect-profiles-29322045-6cg7w\" (UID: \"23bc52e7-0e50-454f-87ee-b82b608ee34a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818346 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e967d0be-0502-4857-9be0-1a6560362c34-webhook-cert\") pod \"packageserver-d55dfcdfc-cqwct\" (UID: \"e967d0be-0502-4857-9be0-1a6560362c34\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818365 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24e14587-7a28-41eb-9184-668739c10654-metrics-certs\") pod \"router-default-5444994796-qtlzx\" (UID: \"24e14587-7a28-41eb-9184-668739c10654\") " pod="openshift-ingress/router-default-5444994796-qtlzx"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818381 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f3d024a0-38bb-4a46-8b02-3f380b51d144-etcd-client\") pod \"etcd-operator-b45778765-wmgkn\" (UID: \"f3d024a0-38bb-4a46-8b02-3f380b51d144\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818461 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23bc52e7-0e50-454f-87ee-b82b608ee34a-secret-volume\") pod \"collect-profiles-29322045-6cg7w\" (UID: \"23bc52e7-0e50-454f-87ee-b82b608ee34a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818524 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7ff570b-9b52-4d2f-b030-40eca1804794-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bhp8c\" (UID: \"c7ff570b-9b52-4d2f-b030-40eca1804794\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bhp8c"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818562 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpnc7\" (UniqueName: \"kubernetes.io/projected/c7ff570b-9b52-4d2f-b030-40eca1804794-kube-api-access-mpnc7\") pod \"control-plane-machine-set-operator-78cbb6b69f-bhp8c\" (UID: \"c7ff570b-9b52-4d2f-b030-40eca1804794\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bhp8c"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818589 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9318518d-b104-4804-86ed-0b53f1c92c44-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6p6zn\" (UID: \"9318518d-b104-4804-86ed-0b53f1c92c44\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6p6zn"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818624 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzxg8\" (UniqueName: \"kubernetes.io/projected/23bc52e7-0e50-454f-87ee-b82b608ee34a-kube-api-access-zzxg8\") pod \"collect-profiles-29322045-6cg7w\" (UID: \"23bc52e7-0e50-454f-87ee-b82b608ee34a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818649 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5648b4bb-f782-4788-b87a-b484fbd1f375-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c9zt4\" (UID: \"5648b4bb-f782-4788-b87a-b484fbd1f375\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c9zt4"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818681 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f541a9af-1ddc-4f57-b6e9-2b1c5baa9953-config\") pod \"service-ca-operator-777779d784-lmpr6\" (UID: \"f541a9af-1ddc-4f57-b6e9-2b1c5baa9953\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lmpr6"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818709 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5648b4bb-f782-4788-b87a-b484fbd1f375-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c9zt4\" (UID: \"5648b4bb-f782-4788-b87a-b484fbd1f375\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c9zt4"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818731 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f541a9af-1ddc-4f57-b6e9-2b1c5baa9953-serving-cert\") pod \"service-ca-operator-777779d784-lmpr6\" (UID: \"f541a9af-1ddc-4f57-b6e9-2b1c5baa9953\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lmpr6"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818754 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs7gx\" (UniqueName: \"kubernetes.io/projected/9318518d-b104-4804-86ed-0b53f1c92c44-kube-api-access-gs7gx\") pod \"openshift-controller-manager-operator-756b6f6bc6-6p6zn\" (UID: \"9318518d-b104-4804-86ed-0b53f1c92c44\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6p6zn"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818807 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8002697d-726f-49e7-830c-c64083912932-signing-key\") pod \"service-ca-9c57cc56f-8dv98\" (UID: \"8002697d-726f-49e7-830c-c64083912932\") " pod="openshift-service-ca/service-ca-9c57cc56f-8dv98"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818841 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24e14587-7a28-41eb-9184-668739c10654-service-ca-bundle\") pod \"router-default-5444994796-qtlzx\" (UID: \"24e14587-7a28-41eb-9184-668739c10654\") " pod="openshift-ingress/router-default-5444994796-qtlzx"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818928 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d07660b5-fafb-43c6-89a0-eb5a359eb314-serving-cert\") pod \"console-operator-58897d9998-mvb7b\" (UID: \"d07660b5-fafb-43c6-89a0-eb5a359eb314\") " pod="openshift-console-operator/console-operator-58897d9998-mvb7b"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.818962 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f3d024a0-38bb-4a46-8b02-3f380b51d144-etcd-ca\") pod \"etcd-operator-b45778765-wmgkn\" (UID: \"f3d024a0-38bb-4a46-8b02-3f380b51d144\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.819049 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e967d0be-0502-4857-9be0-1a6560362c34-apiservice-cert\") pod \"packageserver-d55dfcdfc-cqwct\" (UID: \"e967d0be-0502-4857-9be0-1a6560362c34\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.819069 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8002697d-726f-49e7-830c-c64083912932-signing-cabundle\") pod \"service-ca-9c57cc56f-8dv98\" (UID: \"8002697d-726f-49e7-830c-c64083912932\") " pod="openshift-service-ca/service-ca-9c57cc56f-8dv98"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.819223 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vntpt\" (UniqueName: \"kubernetes.io/projected/d07660b5-fafb-43c6-89a0-eb5a359eb314-kube-api-access-vntpt\") pod \"console-operator-58897d9998-mvb7b\" (UID: \"d07660b5-fafb-43c6-89a0-eb5a359eb314\") " pod="openshift-console-operator/console-operator-58897d9998-mvb7b"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.819276 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdxr5\" (UniqueName: \"kubernetes.io/projected/e967d0be-0502-4857-9be0-1a6560362c34-kube-api-access-zdxr5\") pod \"packageserver-d55dfcdfc-cqwct\" (UID: \"e967d0be-0502-4857-9be0-1a6560362c34\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.822963 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.842978 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.863379 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.883610 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.903567 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.920820 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d07660b5-fafb-43c6-89a0-eb5a359eb314-serving-cert\") pod \"console-operator-58897d9998-mvb7b\" (UID: \"d07660b5-fafb-43c6-89a0-eb5a359eb314\") " pod="openshift-console-operator/console-operator-58897d9998-mvb7b"
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d07660b5-fafb-43c6-89a0-eb5a359eb314-serving-cert\") pod \"console-operator-58897d9998-mvb7b\" (UID: \"d07660b5-fafb-43c6-89a0-eb5a359eb314\") " pod="openshift-console-operator/console-operator-58897d9998-mvb7b" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.920860 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f3d024a0-38bb-4a46-8b02-3f380b51d144-etcd-ca\") pod \"etcd-operator-b45778765-wmgkn\" (UID: \"f3d024a0-38bb-4a46-8b02-3f380b51d144\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.920901 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8002697d-726f-49e7-830c-c64083912932-signing-cabundle\") pod \"service-ca-9c57cc56f-8dv98\" (UID: \"8002697d-726f-49e7-830c-c64083912932\") " pod="openshift-service-ca/service-ca-9c57cc56f-8dv98" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.920921 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e967d0be-0502-4857-9be0-1a6560362c34-apiservice-cert\") pod \"packageserver-d55dfcdfc-cqwct\" (UID: \"e967d0be-0502-4857-9be0-1a6560362c34\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.920939 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vntpt\" (UniqueName: \"kubernetes.io/projected/d07660b5-fafb-43c6-89a0-eb5a359eb314-kube-api-access-vntpt\") pod \"console-operator-58897d9998-mvb7b\" (UID: \"d07660b5-fafb-43c6-89a0-eb5a359eb314\") " pod="openshift-console-operator/console-operator-58897d9998-mvb7b" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.920956 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdxr5\" (UniqueName: \"kubernetes.io/projected/e967d0be-0502-4857-9be0-1a6560362c34-kube-api-access-zdxr5\") pod \"packageserver-d55dfcdfc-cqwct\" (UID: \"e967d0be-0502-4857-9be0-1a6560362c34\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.920997 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/24e14587-7a28-41eb-9184-668739c10654-default-certificate\") pod \"router-default-5444994796-qtlzx\" (UID: \"24e14587-7a28-41eb-9184-668739c10654\") " pod="openshift-ingress/router-default-5444994796-qtlzx" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921014 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d07660b5-fafb-43c6-89a0-eb5a359eb314-config\") pod \"console-operator-58897d9998-mvb7b\" (UID: \"d07660b5-fafb-43c6-89a0-eb5a359eb314\") " pod="openshift-console-operator/console-operator-58897d9998-mvb7b" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921035 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57l6k\" (UniqueName: \"kubernetes.io/projected/8002697d-726f-49e7-830c-c64083912932-kube-api-access-57l6k\") pod \"service-ca-9c57cc56f-8dv98\" (UID: \"8002697d-726f-49e7-830c-c64083912932\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-8dv98" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921060 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-778g7\" (UniqueName: \"kubernetes.io/projected/24e14587-7a28-41eb-9184-668739c10654-kube-api-access-778g7\") pod \"router-default-5444994796-qtlzx\" (UID: \"24e14587-7a28-41eb-9184-668739c10654\") " pod="openshift-ingress/router-default-5444994796-qtlzx" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921077 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3d024a0-38bb-4a46-8b02-3f380b51d144-config\") pod \"etcd-operator-b45778765-wmgkn\" (UID: \"f3d024a0-38bb-4a46-8b02-3f380b51d144\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921091 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d07660b5-fafb-43c6-89a0-eb5a359eb314-trusted-ca\") pod \"console-operator-58897d9998-mvb7b\" (UID: \"d07660b5-fafb-43c6-89a0-eb5a359eb314\") " pod="openshift-console-operator/console-operator-58897d9998-mvb7b" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921107 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/24e14587-7a28-41eb-9184-668739c10654-stats-auth\") pod \"router-default-5444994796-qtlzx\" (UID: \"24e14587-7a28-41eb-9184-668739c10654\") " pod="openshift-ingress/router-default-5444994796-qtlzx" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921122 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpq97\" (UniqueName: \"kubernetes.io/projected/f3d024a0-38bb-4a46-8b02-3f380b51d144-kube-api-access-hpq97\") pod \"etcd-operator-b45778765-wmgkn\" (UID: \"f3d024a0-38bb-4a46-8b02-3f380b51d144\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921138 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkzv4\" (UniqueName: \"kubernetes.io/projected/f541a9af-1ddc-4f57-b6e9-2b1c5baa9953-kube-api-access-dkzv4\") pod \"service-ca-operator-777779d784-lmpr6\" (UID: \"f541a9af-1ddc-4f57-b6e9-2b1c5baa9953\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lmpr6" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921154 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e967d0be-0502-4857-9be0-1a6560362c34-tmpfs\") pod \"packageserver-d55dfcdfc-cqwct\" (UID: \"e967d0be-0502-4857-9be0-1a6560362c34\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921170 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9318518d-b104-4804-86ed-0b53f1c92c44-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6p6zn\" (UID: \"9318518d-b104-4804-86ed-0b53f1c92c44\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6p6zn" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921189 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5648b4bb-f782-4788-b87a-b484fbd1f375-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c9zt4\" (UID: \"5648b4bb-f782-4788-b87a-b484fbd1f375\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c9zt4" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921204 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3d024a0-38bb-4a46-8b02-3f380b51d144-serving-cert\") pod \"etcd-operator-b45778765-wmgkn\" (UID: \"f3d024a0-38bb-4a46-8b02-3f380b51d144\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921220 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f3d024a0-38bb-4a46-8b02-3f380b51d144-etcd-service-ca\") pod \"etcd-operator-b45778765-wmgkn\" (UID: \"f3d024a0-38bb-4a46-8b02-3f380b51d144\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921236 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23bc52e7-0e50-454f-87ee-b82b608ee34a-config-volume\") pod \"collect-profiles-29322045-6cg7w\" (UID: \"23bc52e7-0e50-454f-87ee-b82b608ee34a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921251 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e967d0be-0502-4857-9be0-1a6560362c34-webhook-cert\") pod \"packageserver-d55dfcdfc-cqwct\" (UID: \"e967d0be-0502-4857-9be0-1a6560362c34\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921265 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24e14587-7a28-41eb-9184-668739c10654-metrics-certs\") pod \"router-default-5444994796-qtlzx\" (UID: \"24e14587-7a28-41eb-9184-668739c10654\") " pod="openshift-ingress/router-default-5444994796-qtlzx" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921280 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f3d024a0-38bb-4a46-8b02-3f380b51d144-etcd-client\") pod \"etcd-operator-b45778765-wmgkn\" (UID: \"f3d024a0-38bb-4a46-8b02-3f380b51d144\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921298 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23bc52e7-0e50-454f-87ee-b82b608ee34a-secret-volume\") pod \"collect-profiles-29322045-6cg7w\" (UID: \"23bc52e7-0e50-454f-87ee-b82b608ee34a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921337 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7ff570b-9b52-4d2f-b030-40eca1804794-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bhp8c\" (UID: \"c7ff570b-9b52-4d2f-b030-40eca1804794\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bhp8c" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921355 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpnc7\" (UniqueName: \"kubernetes.io/projected/c7ff570b-9b52-4d2f-b030-40eca1804794-kube-api-access-mpnc7\") pod \"control-plane-machine-set-operator-78cbb6b69f-bhp8c\" (UID: \"c7ff570b-9b52-4d2f-b030-40eca1804794\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bhp8c" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921372 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9318518d-b104-4804-86ed-0b53f1c92c44-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6p6zn\" (UID: \"9318518d-b104-4804-86ed-0b53f1c92c44\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6p6zn" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921389 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzxg8\" (UniqueName: \"kubernetes.io/projected/23bc52e7-0e50-454f-87ee-b82b608ee34a-kube-api-access-zzxg8\") pod \"collect-profiles-29322045-6cg7w\" (UID: \"23bc52e7-0e50-454f-87ee-b82b608ee34a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921404 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5648b4bb-f782-4788-b87a-b484fbd1f375-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c9zt4\" (UID: \"5648b4bb-f782-4788-b87a-b484fbd1f375\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c9zt4" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921419 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f541a9af-1ddc-4f57-b6e9-2b1c5baa9953-config\") pod \"service-ca-operator-777779d784-lmpr6\" (UID: \"f541a9af-1ddc-4f57-b6e9-2b1c5baa9953\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lmpr6" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921435 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5648b4bb-f782-4788-b87a-b484fbd1f375-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c9zt4\" (UID: \"5648b4bb-f782-4788-b87a-b484fbd1f375\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c9zt4" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921449 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f541a9af-1ddc-4f57-b6e9-2b1c5baa9953-serving-cert\") pod \"service-ca-operator-777779d784-lmpr6\" (UID: \"f541a9af-1ddc-4f57-b6e9-2b1c5baa9953\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lmpr6" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921470 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs7gx\" (UniqueName: \"kubernetes.io/projected/9318518d-b104-4804-86ed-0b53f1c92c44-kube-api-access-gs7gx\") pod \"openshift-controller-manager-operator-756b6f6bc6-6p6zn\" (UID: \"9318518d-b104-4804-86ed-0b53f1c92c44\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6p6zn" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921491 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24e14587-7a28-41eb-9184-668739c10654-service-ca-bundle\") pod \"router-default-5444994796-qtlzx\" (UID: \"24e14587-7a28-41eb-9184-668739c10654\") " pod="openshift-ingress/router-default-5444994796-qtlzx" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.921520 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8002697d-726f-49e7-830c-c64083912932-signing-key\") pod \"service-ca-9c57cc56f-8dv98\" (UID: \"8002697d-726f-49e7-830c-c64083912932\") " pod="openshift-service-ca/service-ca-9c57cc56f-8dv98" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.922111 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e967d0be-0502-4857-9be0-1a6560362c34-tmpfs\") pod \"packageserver-d55dfcdfc-cqwct\" (UID: \"e967d0be-0502-4857-9be0-1a6560362c34\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.923872 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.924387 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9318518d-b104-4804-86ed-0b53f1c92c44-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6p6zn\" (UID: \"9318518d-b104-4804-86ed-0b53f1c92c44\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6p6zn" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.924472 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24e14587-7a28-41eb-9184-668739c10654-service-ca-bundle\") pod \"router-default-5444994796-qtlzx\" (UID: \"24e14587-7a28-41eb-9184-668739c10654\") " pod="openshift-ingress/router-default-5444994796-qtlzx" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.926587 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/24e14587-7a28-41eb-9184-668739c10654-default-certificate\") pod \"router-default-5444994796-qtlzx\" (UID: \"24e14587-7a28-41eb-9184-668739c10654\") " pod="openshift-ingress/router-default-5444994796-qtlzx" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.928387 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7ff570b-9b52-4d2f-b030-40eca1804794-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bhp8c\" (UID: \"c7ff570b-9b52-4d2f-b030-40eca1804794\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bhp8c" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.928927 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/24e14587-7a28-41eb-9184-668739c10654-stats-auth\") pod \"router-default-5444994796-qtlzx\" (UID: 
\"24e14587-7a28-41eb-9184-668739c10654\") " pod="openshift-ingress/router-default-5444994796-qtlzx" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.930142 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24e14587-7a28-41eb-9184-668739c10654-metrics-certs\") pod \"router-default-5444994796-qtlzx\" (UID: \"24e14587-7a28-41eb-9184-668739c10654\") " pod="openshift-ingress/router-default-5444994796-qtlzx" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.938403 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9318518d-b104-4804-86ed-0b53f1c92c44-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6p6zn\" (UID: \"9318518d-b104-4804-86ed-0b53f1c92c44\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6p6zn" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.943605 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.964334 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 01 12:55:31 crc kubenswrapper[4851]: I1001 12:55:31.983337 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.003838 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.024230 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.044104 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.063184 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.068742 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5648b4bb-f782-4788-b87a-b484fbd1f375-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c9zt4\" (UID: \"5648b4bb-f782-4788-b87a-b484fbd1f375\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c9zt4" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.083457 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.094758 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5648b4bb-f782-4788-b87a-b484fbd1f375-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c9zt4\" (UID: \"5648b4bb-f782-4788-b87a-b484fbd1f375\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c9zt4" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.103932 4851 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.124431 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.144174 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.163707 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.184562 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.204969 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.224447 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.244731 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.256661 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d07660b5-fafb-43c6-89a0-eb5a359eb314-serving-cert\") pod \"console-operator-58897d9998-mvb7b\" (UID: \"d07660b5-fafb-43c6-89a0-eb5a359eb314\") " pod="openshift-console-operator/console-operator-58897d9998-mvb7b" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.263836 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.274272 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d07660b5-fafb-43c6-89a0-eb5a359eb314-config\") pod \"console-operator-58897d9998-mvb7b\" (UID: \"d07660b5-fafb-43c6-89a0-eb5a359eb314\") " pod="openshift-console-operator/console-operator-58897d9998-mvb7b" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.293749 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.303761 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.305539 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d07660b5-fafb-43c6-89a0-eb5a359eb314-trusted-ca\") pod \"console-operator-58897d9998-mvb7b\" (UID: \"d07660b5-fafb-43c6-89a0-eb5a359eb314\") " pod="openshift-console-operator/console-operator-58897d9998-mvb7b" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.324051 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.340470 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/f3d024a0-38bb-4a46-8b02-3f380b51d144-etcd-client\") pod \"etcd-operator-b45778765-wmgkn\" (UID: \"f3d024a0-38bb-4a46-8b02-3f380b51d144\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.343817 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.364820 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.377563 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3d024a0-38bb-4a46-8b02-3f380b51d144-serving-cert\") pod \"etcd-operator-b45778765-wmgkn\" (UID: \"f3d024a0-38bb-4a46-8b02-3f380b51d144\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.385131 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.394186 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3d024a0-38bb-4a46-8b02-3f380b51d144-config\") pod \"etcd-operator-b45778765-wmgkn\" (UID: \"f3d024a0-38bb-4a46-8b02-3f380b51d144\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.404325 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.413046 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f3d024a0-38bb-4a46-8b02-3f380b51d144-etcd-ca\") pod \"etcd-operator-b45778765-wmgkn\" (UID: \"f3d024a0-38bb-4a46-8b02-3f380b51d144\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.427737 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.434149 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f3d024a0-38bb-4a46-8b02-3f380b51d144-etcd-service-ca\") pod \"etcd-operator-b45778765-wmgkn\" (UID: \"f3d024a0-38bb-4a46-8b02-3f380b51d144\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.445087 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.463809 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.485867 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.504266 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.524213 4851 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.544218 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.564186 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.584916 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.603996 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.624367 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.642718 4851 request.go:700] Waited for 1.007063948s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/configmaps?fieldSelector=metadata.name%3Dmachine-config-operator-images&limit=500&resourceVersion=0 Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.644701 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.665337 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.683717 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.704302 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.724743 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.744569 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.758608 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8002697d-726f-49e7-830c-c64083912932-signing-key\") pod \"service-ca-9c57cc56f-8dv98\" (UID: \"8002697d-726f-49e7-830c-c64083912932\") " pod="openshift-service-ca/service-ca-9c57cc56f-8dv98" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.764010 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.773852 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8002697d-726f-49e7-830c-c64083912932-signing-cabundle\") pod \"service-ca-9c57cc56f-8dv98\" (UID: \"8002697d-726f-49e7-830c-c64083912932\") " pod="openshift-service-ca/service-ca-9c57cc56f-8dv98" Oct 01 12:55:32 crc 
kubenswrapper[4851]: I1001 12:55:32.784280 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.824798 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.844066 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.863475 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.878063 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e967d0be-0502-4857-9be0-1a6560362c34-apiservice-cert\") pod \"packageserver-d55dfcdfc-cqwct\" (UID: \"e967d0be-0502-4857-9be0-1a6560362c34\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.878813 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e967d0be-0502-4857-9be0-1a6560362c34-webhook-cert\") pod \"packageserver-d55dfcdfc-cqwct\" (UID: \"e967d0be-0502-4857-9be0-1a6560362c34\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.883927 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.904776 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 01 12:55:32 crc kubenswrapper[4851]: E1001 12:55:32.923158 4851 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Oct 01 12:55:32 crc kubenswrapper[4851]: E1001 12:55:32.923291 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23bc52e7-0e50-454f-87ee-b82b608ee34a-config-volume podName:23bc52e7-0e50-454f-87ee-b82b608ee34a nodeName:}" failed. No retries permitted until 2025-10-01 12:55:33.423254385 +0000 UTC m=+141.768371951 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/23bc52e7-0e50-454f-87ee-b82b608ee34a-config-volume") pod "collect-profiles-29322045-6cg7w" (UID: "23bc52e7-0e50-454f-87ee-b82b608ee34a") : failed to sync configmap cache: timed out waiting for the condition Oct 01 12:55:32 crc kubenswrapper[4851]: E1001 12:55:32.923658 4851 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Oct 01 12:55:32 crc kubenswrapper[4851]: E1001 12:55:32.923758 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f541a9af-1ddc-4f57-b6e9-2b1c5baa9953-config podName:f541a9af-1ddc-4f57-b6e9-2b1c5baa9953 nodeName:}" failed. No retries permitted until 2025-10-01 12:55:33.423733148 +0000 UTC m=+141.768850674 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/f541a9af-1ddc-4f57-b6e9-2b1c5baa9953-config") pod "service-ca-operator-777779d784-lmpr6" (UID: "f541a9af-1ddc-4f57-b6e9-2b1c5baa9953") : failed to sync configmap cache: timed out waiting for the condition Oct 01 12:55:32 crc kubenswrapper[4851]: E1001 12:55:32.923806 4851 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Oct 01 12:55:32 crc kubenswrapper[4851]: E1001 12:55:32.923849 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23bc52e7-0e50-454f-87ee-b82b608ee34a-secret-volume podName:23bc52e7-0e50-454f-87ee-b82b608ee34a nodeName:}" failed. No retries permitted until 2025-10-01 12:55:33.423836551 +0000 UTC m=+141.768954077 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-volume" (UniqueName: "kubernetes.io/secret/23bc52e7-0e50-454f-87ee-b82b608ee34a-secret-volume") pod "collect-profiles-29322045-6cg7w" (UID: "23bc52e7-0e50-454f-87ee-b82b608ee34a") : failed to sync secret cache: timed out waiting for the condition Oct 01 12:55:32 crc kubenswrapper[4851]: E1001 12:55:32.923873 4851 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 01 12:55:32 crc kubenswrapper[4851]: E1001 12:55:32.923954 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f541a9af-1ddc-4f57-b6e9-2b1c5baa9953-serving-cert podName:f541a9af-1ddc-4f57-b6e9-2b1c5baa9953 nodeName:}" failed. No retries permitted until 2025-10-01 12:55:33.423929924 +0000 UTC m=+141.769047440 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f541a9af-1ddc-4f57-b6e9-2b1c5baa9953-serving-cert") pod "service-ca-operator-777779d784-lmpr6" (UID: "f541a9af-1ddc-4f57-b6e9-2b1c5baa9953") : failed to sync secret cache: timed out waiting for the condition Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.925140 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.944178 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.964020 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 01 12:55:32 crc kubenswrapper[4851]: I1001 12:55:32.984429 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.004399 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.024108 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.044184 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.064243 4851 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.083384 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.104496 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.122905 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.144090 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.164097 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.184690 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.211570 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.224055 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.244035 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.264130 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.283885 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.323791 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.328418 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7trkq\" (UniqueName: \"kubernetes.io/projected/c557af36-f061-43db-9a5b-e88e8df92493-kube-api-access-7trkq\") pod \"apiserver-76f77b778f-v2v8j\" (UID: \"c557af36-f061-43db-9a5b-e88e8df92493\") " pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.344408 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.364074 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.433875 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krwjm\" (UniqueName: \"kubernetes.io/projected/86146cb0-0a54-4309-8345-565e2da39442-kube-api-access-krwjm\") pod \"oauth-openshift-558db77b4-6z8fq\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") " pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.445384 4851 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23bc52e7-0e50-454f-87ee-b82b608ee34a-config-volume\") pod \"collect-profiles-29322045-6cg7w\" (UID: \"23bc52e7-0e50-454f-87ee-b82b608ee34a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.445443 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23bc52e7-0e50-454f-87ee-b82b608ee34a-secret-volume\") pod \"collect-profiles-29322045-6cg7w\" (UID: \"23bc52e7-0e50-454f-87ee-b82b608ee34a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.445489 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f541a9af-1ddc-4f57-b6e9-2b1c5baa9953-config\") pod \"service-ca-operator-777779d784-lmpr6\" (UID: \"f541a9af-1ddc-4f57-b6e9-2b1c5baa9953\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lmpr6" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.445550 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f541a9af-1ddc-4f57-b6e9-2b1c5baa9953-serving-cert\") pod \"service-ca-operator-777779d784-lmpr6\" (UID: \"f541a9af-1ddc-4f57-b6e9-2b1c5baa9953\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lmpr6" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.446640 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f541a9af-1ddc-4f57-b6e9-2b1c5baa9953-config\") pod \"service-ca-operator-777779d784-lmpr6\" (UID: \"f541a9af-1ddc-4f57-b6e9-2b1c5baa9953\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lmpr6" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.446886 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23bc52e7-0e50-454f-87ee-b82b608ee34a-config-volume\") pod \"collect-profiles-29322045-6cg7w\" (UID: \"23bc52e7-0e50-454f-87ee-b82b608ee34a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.450444 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23bc52e7-0e50-454f-87ee-b82b608ee34a-secret-volume\") pod \"collect-profiles-29322045-6cg7w\" (UID: \"23bc52e7-0e50-454f-87ee-b82b608ee34a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.451420 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc5m4\" (UniqueName: \"kubernetes.io/projected/2e0769b5-2ba5-4464-b580-ec55796e5ea1-kube-api-access-lc5m4\") pod \"cluster-image-registry-operator-dc59b4c8b-2qlz9\" (UID: \"2e0769b5-2ba5-4464-b580-ec55796e5ea1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2qlz9" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.451931 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f541a9af-1ddc-4f57-b6e9-2b1c5baa9953-serving-cert\") pod \"service-ca-operator-777779d784-lmpr6\" (UID: 
\"f541a9af-1ddc-4f57-b6e9-2b1c5baa9953\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lmpr6" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.472081 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4pkj\" (UniqueName: \"kubernetes.io/projected/0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697-kube-api-access-q4pkj\") pod \"machine-api-operator-5694c8668f-68zzg\" (UID: \"0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68zzg" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.485478 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2e0769b5-2ba5-4464-b580-ec55796e5ea1-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-2qlz9\" (UID: \"2e0769b5-2ba5-4464-b580-ec55796e5ea1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2qlz9" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.510987 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsfvp\" (UniqueName: \"kubernetes.io/projected/327992e6-37f8-487b-915b-b633070aece3-kube-api-access-wsfvp\") pod \"apiserver-7bbb656c7d-wwhr7\" (UID: \"327992e6-37f8-487b-915b-b633070aece3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.530748 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkkqh\" (UniqueName: \"kubernetes.io/projected/b5b5efe6-729a-431f-b8cd-67562ec18593-kube-api-access-xkkqh\") pod \"console-f9d7485db-xjvqh\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.544349 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.551755 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7fss\" (UniqueName: \"kubernetes.io/projected/f6538b69-30fc-4cc0-80da-62537b61f41f-kube-api-access-g7fss\") pod \"controller-manager-879f6c89f-k7rvb\" (UID: \"f6538b69-30fc-4cc0-80da-62537b61f41f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.563938 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.584030 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.604011 4851 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.623723 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.624711 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.633771 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.644077 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.659025 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.661870 4851 request.go:700] Waited for 1.850830353s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Dcanary-serving-cert&limit=500&resourceVersion=0 Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.664454 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.671179 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2qlz9" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.685038 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.704352 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.714627 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.724198 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.729972 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.741805 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-68zzg" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.776060 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpq97\" (UniqueName: \"kubernetes.io/projected/f3d024a0-38bb-4a46-8b02-3f380b51d144-kube-api-access-hpq97\") pod \"etcd-operator-b45778765-wmgkn\" (UID: \"f3d024a0-38bb-4a46-8b02-3f380b51d144\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.789310 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vntpt\" (UniqueName: \"kubernetes.io/projected/d07660b5-fafb-43c6-89a0-eb5a359eb314-kube-api-access-vntpt\") pod \"console-operator-58897d9998-mvb7b\" (UID: \"d07660b5-fafb-43c6-89a0-eb5a359eb314\") " pod="openshift-console-operator/console-operator-58897d9998-mvb7b" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.808400 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdxr5\" (UniqueName: \"kubernetes.io/projected/e967d0be-0502-4857-9be0-1a6560362c34-kube-api-access-zdxr5\") pod \"packageserver-d55dfcdfc-cqwct\" (UID: \"e967d0be-0502-4857-9be0-1a6560362c34\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.828120 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkzv4\" (UniqueName: \"kubernetes.io/projected/f541a9af-1ddc-4f57-b6e9-2b1c5baa9953-kube-api-access-dkzv4\") pod \"service-ca-operator-777779d784-lmpr6\" (UID: \"f541a9af-1ddc-4f57-b6e9-2b1c5baa9953\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lmpr6" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.842188 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57l6k\" (UniqueName: \"kubernetes.io/projected/8002697d-726f-49e7-830c-c64083912932-kube-api-access-57l6k\") pod \"service-ca-9c57cc56f-8dv98\" (UID: \"8002697d-726f-49e7-830c-c64083912932\") " pod="openshift-service-ca/service-ca-9c57cc56f-8dv98" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.876026 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-778g7\" (UniqueName: \"kubernetes.io/projected/24e14587-7a28-41eb-9184-668739c10654-kube-api-access-778g7\") pod \"router-default-5444994796-qtlzx\" (UID: \"24e14587-7a28-41eb-9184-668739c10654\") " pod="openshift-ingress/router-default-5444994796-qtlzx" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.885020 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5648b4bb-f782-4788-b87a-b484fbd1f375-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c9zt4\" (UID: \"5648b4bb-f782-4788-b87a-b484fbd1f375\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c9zt4" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.897212 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpnc7\" (UniqueName: \"kubernetes.io/projected/c7ff570b-9b52-4d2f-b030-40eca1804794-kube-api-access-mpnc7\") pod \"control-plane-machine-set-operator-78cbb6b69f-bhp8c\" (UID: \"c7ff570b-9b52-4d2f-b030-40eca1804794\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bhp8c" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 
12:55:33.903907 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-qtlzx" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.917051 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs7gx\" (UniqueName: \"kubernetes.io/projected/9318518d-b104-4804-86ed-0b53f1c92c44-kube-api-access-gs7gx\") pod \"openshift-controller-manager-operator-756b6f6bc6-6p6zn\" (UID: \"9318518d-b104-4804-86ed-0b53f1c92c44\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6p6zn" Oct 01 12:55:33 crc kubenswrapper[4851]: W1001 12:55:33.918955 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24e14587_7a28_41eb_9184_668739c10654.slice/crio-4fa0b75f70240103c6a1ca19f2877c6a557ee62631117340c44d5258ca11fd67 WatchSource:0}: Error finding container 4fa0b75f70240103c6a1ca19f2877c6a557ee62631117340c44d5258ca11fd67: Status 404 returned error can't find the container with id 4fa0b75f70240103c6a1ca19f2877c6a557ee62631117340c44d5258ca11fd67 Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.935745 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6p6zn" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.940477 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c9zt4" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.941119 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzxg8\" (UniqueName: \"kubernetes.io/projected/23bc52e7-0e50-454f-87ee-b82b608ee34a-kube-api-access-zzxg8\") pod \"collect-profiles-29322045-6cg7w\" (UID: \"23bc52e7-0e50-454f-87ee-b82b608ee34a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.945219 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k7rvb"] Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.953647 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-mvb7b" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.962940 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.980888 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct" Oct 01 12:55:33 crc kubenswrapper[4851]: I1001 12:55:33.994750 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8dv98" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.014922 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lmpr6" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.025792 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-v2v8j"] Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.053877 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7"] Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.054716 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a54d2004-e4e0-4b2c-b026-e82a6f241da7-registry-certificates\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.054744 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8de9ba8e-0c67-49d3-becd-c845dbba7335-bound-sa-token\") pod \"ingress-operator-5b745b69d9-72dqh\" (UID: \"8de9ba8e-0c67-49d3-becd-c845dbba7335\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-72dqh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.054762 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a5b1d979-1dee-401d-89e9-92314369ca7d-srv-cert\") pod \"catalog-operator-68c6474976-28f4n\" (UID: \"a5b1d979-1dee-401d-89e9-92314369ca7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-28f4n" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.054777 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aef587e8-7f95-42cd-9a4f-10b1024abef0-proxy-tls\") pod \"machine-config-operator-74547568cd-nk8rb\" (UID: \"aef587e8-7f95-42cd-9a4f-10b1024abef0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nk8rb" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.054796 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltnk8\" (UniqueName: \"kubernetes.io/projected/68776259-af0d-442c-b3e5-5e242a7b8a9f-kube-api-access-ltnk8\") pod \"machine-approver-56656f9798-b7tcr\" (UID: \"68776259-af0d-442c-b3e5-5e242a7b8a9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b7tcr" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.054815 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d8acc9e-4d2b-4118-9e90-d95ce539747a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2kfkj\" (UID: \"3d8acc9e-4d2b-4118-9e90-d95ce539747a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2kfkj" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.054833 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prtfr\" (UniqueName: \"kubernetes.io/projected/3ed2545b-85c0-4141-819a-18c221911af9-kube-api-access-prtfr\") pod \"multus-admission-controller-857f4d67dd-b76sz\" (UID: \"3ed2545b-85c0-4141-819a-18c221911af9\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-b76sz" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.054848 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ldmk\" (UniqueName: \"kubernetes.io/projected/aef587e8-7f95-42cd-9a4f-10b1024abef0-kube-api-access-5ldmk\") pod \"machine-config-operator-74547568cd-nk8rb\" (UID: \"aef587e8-7f95-42cd-9a4f-10b1024abef0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nk8rb" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.054883 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmhlh\" (UniqueName: \"kubernetes.io/projected/3d8acc9e-4d2b-4118-9e90-d95ce539747a-kube-api-access-jmhlh\") pod \"package-server-manager-789f6589d5-2kfkj\" (UID: \"3d8acc9e-4d2b-4118-9e90-d95ce539747a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2kfkj" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.054897 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-client-ca\") pod \"route-controller-manager-6576b87f9c-rgkzd\" (UID: \"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.054922 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a54d2004-e4e0-4b2c-b026-e82a6f241da7-bound-sa-token\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.054939 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e00479b8-ba27-494b-907d-a4a201ae4be1-config\") pod \"kube-controller-manager-operator-78b949d7b-jqn7q\" (UID: \"e00479b8-ba27-494b-907d-a4a201ae4be1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jqn7q" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.054955 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8de9ba8e-0c67-49d3-becd-c845dbba7335-metrics-tls\") pod \"ingress-operator-5b745b69d9-72dqh\" (UID: \"8de9ba8e-0c67-49d3-becd-c845dbba7335\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-72dqh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.054973 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4h96\" (UniqueName: \"kubernetes.io/projected/d8e7ea87-4895-4f48-b774-e337103d3951-kube-api-access-x4h96\") pod \"openshift-config-operator-7777fb866f-x5dht\" (UID: \"d8e7ea87-4895-4f48-b774-e337103d3951\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x5dht" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.054989 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/aef587e8-7f95-42cd-9a4f-10b1024abef0-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-nk8rb\" (UID: \"aef587e8-7f95-42cd-9a4f-10b1024abef0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nk8rb" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.055003 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/68776259-af0d-442c-b3e5-5e242a7b8a9f-auth-proxy-config\") pod \"machine-approver-56656f9798-b7tcr\" (UID: \"68776259-af0d-442c-b3e5-5e242a7b8a9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b7tcr" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.055017 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a93ae38a-3f2c-42bf-9da9-d30f439a371f-srv-cert\") pod \"olm-operator-6b444d44fb-dq7rl\" (UID: \"a93ae38a-3f2c-42bf-9da9-d30f439a371f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dq7rl" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.055032 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0266e2f-59eb-4610-beef-77397d4c297f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vxwl6\" (UID: \"c0266e2f-59eb-4610-beef-77397d4c297f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vxwl6" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.055047 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d8e7ea87-4895-4f48-b774-e337103d3951-available-featuregates\") pod \"openshift-config-operator-7777fb866f-x5dht\" (UID: \"d8e7ea87-4895-4f48-b774-e337103d3951\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x5dht" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.055063 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnvbd\" (UniqueName: \"kubernetes.io/projected/68c1357c-a93f-4ecc-9743-2bdf982607c2-kube-api-access-rnvbd\") pod \"dns-operator-744455d44c-6ccbw\" (UID: \"68c1357c-a93f-4ecc-9743-2bdf982607c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-6ccbw" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.055077 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e723ab1e-2b65-4236-91f4-1dbae3e4acf7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7hnt8\" (UID: \"e723ab1e-2b65-4236-91f4-1dbae3e4acf7\") " pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.055093 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87722bcd-9eeb-4b6b-ba53-b6283053a06f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mffkh\" (UID: \"87722bcd-9eeb-4b6b-ba53-b6283053a06f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mffkh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.055107 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c0266e2f-59eb-4610-beef-77397d4c297f-config\") pod \"authentication-operator-69f744f599-vxwl6\" (UID: \"c0266e2f-59eb-4610-beef-77397d4c297f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vxwl6" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.055121 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88g9p\" (UniqueName: \"kubernetes.io/projected/e723ab1e-2b65-4236-91f4-1dbae3e4acf7-kube-api-access-88g9p\") pod \"marketplace-operator-79b997595-7hnt8\" (UID: \"e723ab1e-2b65-4236-91f4-1dbae3e4acf7\") " pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.055152 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/91e48130-a450-4f7e-aa07-bfcaa13aae66-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-fv9xq\" (UID: \"91e48130-a450-4f7e-aa07-bfcaa13aae66\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fv9xq" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.055169 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a54d2004-e4e0-4b2c-b026-e82a6f241da7-registry-tls\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.055184 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8de9ba8e-0c67-49d3-becd-c845dbba7335-trusted-ca\") pod \"ingress-operator-5b745b69d9-72dqh\" (UID: \"8de9ba8e-0c67-49d3-becd-c845dbba7335\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-72dqh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.055201 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d1f1318-7c6a-42ea-a987-c419603e81bf-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fclxw\" (UID: \"9d1f1318-7c6a-42ea-a987-c419603e81bf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fclxw" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.057528 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mss5\" (UniqueName: \"kubernetes.io/projected/c0266e2f-59eb-4610-beef-77397d4c297f-kube-api-access-6mss5\") pod \"authentication-operator-69f744f599-vxwl6\" (UID: \"c0266e2f-59eb-4610-beef-77397d4c297f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vxwl6" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.057555 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e723ab1e-2b65-4236-91f4-1dbae3e4acf7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7hnt8\" (UID: \"e723ab1e-2b65-4236-91f4-1dbae3e4acf7\") " pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.057583 4851 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0266e2f-59eb-4610-beef-77397d4c297f-service-ca-bundle\") pod \"authentication-operator-69f744f599-vxwl6\" (UID: \"c0266e2f-59eb-4610-beef-77397d4c297f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vxwl6" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.057622 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a54d2004-e4e0-4b2c-b026-e82a6f241da7-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.057644 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1eb7909a-cc95-480b-bd8d-e1f6092f3baa-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7wzb9\" (UID: \"1eb7909a-cc95-480b-bd8d-e1f6092f3baa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7wzb9" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.057951 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh9hv\" (UniqueName: \"kubernetes.io/projected/dbd5b9f8-88ba-4b49-86b0-ce9de90aa30d-kube-api-access-vh9hv\") pod \"cluster-samples-operator-665b6dd947-tbg25\" (UID: \"dbd5b9f8-88ba-4b49-86b0-ce9de90aa30d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tbg25" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.058000 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68776259-af0d-442c-b3e5-5e242a7b8a9f-config\") pod \"machine-approver-56656f9798-b7tcr\" (UID: \"68776259-af0d-442c-b3e5-5e242a7b8a9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b7tcr" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.058038 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sxw6\" (UniqueName: \"kubernetes.io/projected/a5b1d979-1dee-401d-89e9-92314369ca7d-kube-api-access-2sxw6\") pod \"catalog-operator-68c6474976-28f4n\" (UID: \"a5b1d979-1dee-401d-89e9-92314369ca7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-28f4n" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.058386 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00931d70-10d2-42f3-83ff-6a25098df202-config-volume\") pod \"dns-default-t26f4\" (UID: \"00931d70-10d2-42f3-83ff-6a25098df202\") " pod="openshift-dns/dns-default-t26f4" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.058416 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91e48130-a450-4f7e-aa07-bfcaa13aae66-proxy-tls\") pod \"machine-config-controller-84d6567774-fv9xq\" (UID: \"91e48130-a450-4f7e-aa07-bfcaa13aae66\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fv9xq" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.058435 4851 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87722bcd-9eeb-4b6b-ba53-b6283053a06f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mffkh\" (UID: \"87722bcd-9eeb-4b6b-ba53-b6283053a06f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mffkh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.058457 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/aef587e8-7f95-42cd-9a4f-10b1024abef0-images\") pod \"machine-config-operator-74547568cd-nk8rb\" (UID: \"aef587e8-7f95-42cd-9a4f-10b1024abef0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nk8rb" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.058515 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3ed2545b-85c0-4141-819a-18c221911af9-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-b76sz\" (UID: \"3ed2545b-85c0-4141-819a-18c221911af9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-b76sz" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.058662 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0266e2f-59eb-4610-beef-77397d4c297f-serving-cert\") pod \"authentication-operator-69f744f599-vxwl6\" (UID: \"c0266e2f-59eb-4610-beef-77397d4c297f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vxwl6" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.058699 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a54d2004-e4e0-4b2c-b026-e82a6f241da7-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.058725 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp5kp\" (UniqueName: \"kubernetes.io/projected/8de9ba8e-0c67-49d3-becd-c845dbba7335-kube-api-access-qp5kp\") pod \"ingress-operator-5b745b69d9-72dqh\" (UID: \"8de9ba8e-0c67-49d3-becd-c845dbba7335\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-72dqh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.058744 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prg7d\" (UniqueName: \"kubernetes.io/projected/9d1f1318-7c6a-42ea-a987-c419603e81bf-kube-api-access-prg7d\") pod \"kube-storage-version-migrator-operator-b67b599dd-fclxw\" (UID: \"9d1f1318-7c6a-42ea-a987-c419603e81bf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fclxw" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.058978 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e00479b8-ba27-494b-907d-a4a201ae4be1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-jqn7q\" (UID: \"e00479b8-ba27-494b-907d-a4a201ae4be1\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jqn7q" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.059008 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/68776259-af0d-442c-b3e5-5e242a7b8a9f-machine-approver-tls\") pod \"machine-approver-56656f9798-b7tcr\" (UID: \"68776259-af0d-442c-b3e5-5e242a7b8a9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b7tcr" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.059023 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-config\") pod \"route-controller-manager-6576b87f9c-rgkzd\" (UID: \"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.059038 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8e7ea87-4895-4f48-b774-e337103d3951-serving-cert\") pod \"openshift-config-operator-7777fb866f-x5dht\" (UID: \"d8e7ea87-4895-4f48-b774-e337103d3951\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x5dht" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.059061 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9g46\" (UniqueName: \"kubernetes.io/projected/928cc097-331d-49d7-8a9d-957e2032f941-kube-api-access-d9g46\") pod \"downloads-7954f5f757-qcsbq\" (UID: \"928cc097-331d-49d7-8a9d-957e2032f941\") " pod="openshift-console/downloads-7954f5f757-qcsbq" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.059083 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00931d70-10d2-42f3-83ff-6a25098df202-metrics-tls\") pod \"dns-default-t26f4\" (UID: \"00931d70-10d2-42f3-83ff-6a25098df202\") " pod="openshift-dns/dns-default-t26f4" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.059099 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs8cj\" (UniqueName: \"kubernetes.io/projected/a54d2004-e4e0-4b2c-b026-e82a6f241da7-kube-api-access-fs8cj\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.059142 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1eb7909a-cc95-480b-bd8d-e1f6092f3baa-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7wzb9\" (UID: \"1eb7909a-cc95-480b-bd8d-e1f6092f3baa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7wzb9" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.059156 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-serving-cert\") pod \"route-controller-manager-6576b87f9c-rgkzd\" (UID: \"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.059173 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqr49\" (UniqueName: \"kubernetes.io/projected/00931d70-10d2-42f3-83ff-6a25098df202-kube-api-access-dqr49\") pod \"dns-default-t26f4\" (UID: \"00931d70-10d2-42f3-83ff-6a25098df202\") " pod="openshift-dns/dns-default-t26f4" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.059326 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d1f1318-7c6a-42ea-a987-c419603e81bf-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fclxw\" (UID: \"9d1f1318-7c6a-42ea-a987-c419603e81bf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fclxw" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.059348 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/68c1357c-a93f-4ecc-9743-2bdf982607c2-metrics-tls\") pod \"dns-operator-744455d44c-6ccbw\" (UID: \"68c1357c-a93f-4ecc-9743-2bdf982607c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-6ccbw" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.059374 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.059422 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a54d2004-e4e0-4b2c-b026-e82a6f241da7-trusted-ca\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: E1001 12:55:34.059625 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:34.559613905 +0000 UTC m=+142.904731391 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.059939 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s2qb\" (UniqueName: \"kubernetes.io/projected/45efbf96-913d-4b37-87dc-0ff84f96257a-kube-api-access-8s2qb\") pod \"migrator-59844c95c7-qzxpg\" (UID: \"45efbf96-913d-4b37-87dc-0ff84f96257a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qzxpg" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.059956 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hppgp\" (UniqueName: \"kubernetes.io/projected/a93ae38a-3f2c-42bf-9da9-d30f439a371f-kube-api-access-hppgp\") pod \"olm-operator-6b444d44fb-dq7rl\" (UID: \"a93ae38a-3f2c-42bf-9da9-d30f439a371f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dq7rl" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.059986 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a93ae38a-3f2c-42bf-9da9-d30f439a371f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dq7rl\" (UID: \"a93ae38a-3f2c-42bf-9da9-d30f439a371f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dq7rl" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.060031 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wnjr\" (UniqueName: \"kubernetes.io/projected/87722bcd-9eeb-4b6b-ba53-b6283053a06f-kube-api-access-2wnjr\") pod \"openshift-apiserver-operator-796bbdcf4f-mffkh\" (UID: \"87722bcd-9eeb-4b6b-ba53-b6283053a06f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mffkh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.060046 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e00479b8-ba27-494b-907d-a4a201ae4be1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-jqn7q\" (UID: \"e00479b8-ba27-494b-907d-a4a201ae4be1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jqn7q" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.060062 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhgmn\" (UniqueName: \"kubernetes.io/projected/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-kube-api-access-zhgmn\") pod \"route-controller-manager-6576b87f9c-rgkzd\" (UID: \"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.060082 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dbd5b9f8-88ba-4b49-86b0-ce9de90aa30d-samples-operator-tls\") pod 
\"cluster-samples-operator-665b6dd947-tbg25\" (UID: \"dbd5b9f8-88ba-4b49-86b0-ce9de90aa30d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tbg25" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.060095 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1eb7909a-cc95-480b-bd8d-e1f6092f3baa-config\") pod \"kube-apiserver-operator-766d6c64bb-7wzb9\" (UID: \"1eb7909a-cc95-480b-bd8d-e1f6092f3baa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7wzb9" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.060139 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlvh9\" (UniqueName: \"kubernetes.io/projected/91e48130-a450-4f7e-aa07-bfcaa13aae66-kube-api-access-xlvh9\") pod \"machine-config-controller-84d6567774-fv9xq\" (UID: \"91e48130-a450-4f7e-aa07-bfcaa13aae66\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fv9xq" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.060158 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a5b1d979-1dee-401d-89e9-92314369ca7d-profile-collector-cert\") pod \"catalog-operator-68c6474976-28f4n\" (UID: \"a5b1d979-1dee-401d-89e9-92314369ca7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-28f4n" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.062184 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.125779 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-68zzg"] Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.128613 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6z8fq"] Oct 01 12:55:34 crc kubenswrapper[4851]: W1001 12:55:34.131962 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ed3ddbf_1afc_4704_9ad0_29ae4e3a6697.slice/crio-961bae1a3f006d9939a8fb034aa8548d0e0257ab11c3644c60d189a4af1d9dca WatchSource:0}: Error finding container 961bae1a3f006d9939a8fb034aa8548d0e0257ab11c3644c60d189a4af1d9dca: Status 404 returned error can't find the container with id 961bae1a3f006d9939a8fb034aa8548d0e0257ab11c3644c60d189a4af1d9dca Oct 01 12:55:34 crc kubenswrapper[4851]: W1001 12:55:34.159732 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86146cb0_0a54_4309_8345_565e2da39442.slice/crio-f4005abaf17d20fb4518a3ea738bcc70e639fbbe57dcd3f12f2e659b106c2073 WatchSource:0}: Error finding container f4005abaf17d20fb4518a3ea738bcc70e639fbbe57dcd3f12f2e659b106c2073: Status 404 returned error can't find the container with id f4005abaf17d20fb4518a3ea738bcc70e639fbbe57dcd3f12f2e659b106c2073 Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.161346 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.161696 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fe62c67c-9803-4d61-a882-9406db825f8a-node-bootstrap-token\") pod \"machine-config-server-f6l2s\" (UID: \"fe62c67c-9803-4d61-a882-9406db825f8a\") " pod="openshift-machine-config-operator/machine-config-server-f6l2s" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.161725 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a5b1d979-1dee-401d-89e9-92314369ca7d-srv-cert\") pod \"catalog-operator-68c6474976-28f4n\" (UID: \"a5b1d979-1dee-401d-89e9-92314369ca7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-28f4n" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.161748 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aef587e8-7f95-42cd-9a4f-10b1024abef0-proxy-tls\") pod \"machine-config-operator-74547568cd-nk8rb\" (UID: \"aef587e8-7f95-42cd-9a4f-10b1024abef0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nk8rb" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.161767 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d17b1f73-c9dd-494a-a996-d4bc6ee4aec7-csi-data-dir\") pod \"csi-hostpathplugin-gjvsn\" (UID: \"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7\") " pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.161795 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltnk8\" (UniqueName: \"kubernetes.io/projected/68776259-af0d-442c-b3e5-5e242a7b8a9f-kube-api-access-ltnk8\") pod \"machine-approver-56656f9798-b7tcr\" (UID: \"68776259-af0d-442c-b3e5-5e242a7b8a9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b7tcr" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.161813 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d8acc9e-4d2b-4118-9e90-d95ce539747a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2kfkj\" (UID: \"3d8acc9e-4d2b-4118-9e90-d95ce539747a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2kfkj" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.161832 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prtfr\" (UniqueName: \"kubernetes.io/projected/3ed2545b-85c0-4141-819a-18c221911af9-kube-api-access-prtfr\") pod \"multus-admission-controller-857f4d67dd-b76sz\" (UID: \"3ed2545b-85c0-4141-819a-18c221911af9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-b76sz" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.161858 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmhlh\" (UniqueName: \"kubernetes.io/projected/3d8acc9e-4d2b-4118-9e90-d95ce539747a-kube-api-access-jmhlh\") pod \"package-server-manager-789f6589d5-2kfkj\" (UID: \"3d8acc9e-4d2b-4118-9e90-d95ce539747a\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2kfkj" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.161873 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ldmk\" (UniqueName: \"kubernetes.io/projected/aef587e8-7f95-42cd-9a4f-10b1024abef0-kube-api-access-5ldmk\") pod \"machine-config-operator-74547568cd-nk8rb\" (UID: \"aef587e8-7f95-42cd-9a4f-10b1024abef0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nk8rb" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.161898 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-client-ca\") pod \"route-controller-manager-6576b87f9c-rgkzd\" (UID: \"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.161913 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a54d2004-e4e0-4b2c-b026-e82a6f241da7-bound-sa-token\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.161930 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e00479b8-ba27-494b-907d-a4a201ae4be1-config\") pod \"kube-controller-manager-operator-78b949d7b-jqn7q\" (UID: \"e00479b8-ba27-494b-907d-a4a201ae4be1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jqn7q" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.161946 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8de9ba8e-0c67-49d3-becd-c845dbba7335-metrics-tls\") pod \"ingress-operator-5b745b69d9-72dqh\" (UID: \"8de9ba8e-0c67-49d3-becd-c845dbba7335\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-72dqh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.161987 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/aef587e8-7f95-42cd-9a4f-10b1024abef0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-nk8rb\" (UID: \"aef587e8-7f95-42cd-9a4f-10b1024abef0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nk8rb" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162007 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4h96\" (UniqueName: \"kubernetes.io/projected/d8e7ea87-4895-4f48-b774-e337103d3951-kube-api-access-x4h96\") pod \"openshift-config-operator-7777fb866f-x5dht\" (UID: \"d8e7ea87-4895-4f48-b774-e337103d3951\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x5dht" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162031 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a93ae38a-3f2c-42bf-9da9-d30f439a371f-srv-cert\") pod \"olm-operator-6b444d44fb-dq7rl\" (UID: \"a93ae38a-3f2c-42bf-9da9-d30f439a371f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dq7rl" Oct 01 12:55:34 crc 
kubenswrapper[4851]: I1001 12:55:34.162047 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/68776259-af0d-442c-b3e5-5e242a7b8a9f-auth-proxy-config\") pod \"machine-approver-56656f9798-b7tcr\" (UID: \"68776259-af0d-442c-b3e5-5e242a7b8a9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b7tcr" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162086 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d8e7ea87-4895-4f48-b774-e337103d3951-available-featuregates\") pod \"openshift-config-operator-7777fb866f-x5dht\" (UID: \"d8e7ea87-4895-4f48-b774-e337103d3951\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x5dht" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162116 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnvbd\" (UniqueName: \"kubernetes.io/projected/68c1357c-a93f-4ecc-9743-2bdf982607c2-kube-api-access-rnvbd\") pod \"dns-operator-744455d44c-6ccbw\" (UID: \"68c1357c-a93f-4ecc-9743-2bdf982607c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-6ccbw" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162152 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0266e2f-59eb-4610-beef-77397d4c297f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vxwl6\" (UID: \"c0266e2f-59eb-4610-beef-77397d4c297f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vxwl6" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162168 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e723ab1e-2b65-4236-91f4-1dbae3e4acf7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7hnt8\" (UID: \"e723ab1e-2b65-4236-91f4-1dbae3e4acf7\") " pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162184 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fe62c67c-9803-4d61-a882-9406db825f8a-certs\") pod \"machine-config-server-f6l2s\" (UID: \"fe62c67c-9803-4d61-a882-9406db825f8a\") " pod="openshift-machine-config-operator/machine-config-server-f6l2s" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162218 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87722bcd-9eeb-4b6b-ba53-b6283053a06f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mffkh\" (UID: \"87722bcd-9eeb-4b6b-ba53-b6283053a06f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mffkh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162235 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0266e2f-59eb-4610-beef-77397d4c297f-config\") pod \"authentication-operator-69f744f599-vxwl6\" (UID: \"c0266e2f-59eb-4610-beef-77397d4c297f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vxwl6" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162252 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-88g9p\" (UniqueName: \"kubernetes.io/projected/e723ab1e-2b65-4236-91f4-1dbae3e4acf7-kube-api-access-88g9p\") pod \"marketplace-operator-79b997595-7hnt8\" (UID: \"e723ab1e-2b65-4236-91f4-1dbae3e4acf7\") " pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162270 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/91e48130-a450-4f7e-aa07-bfcaa13aae66-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-fv9xq\" (UID: \"91e48130-a450-4f7e-aa07-bfcaa13aae66\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fv9xq" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162288 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8rxz\" (UniqueName: \"kubernetes.io/projected/fe62c67c-9803-4d61-a882-9406db825f8a-kube-api-access-v8rxz\") pod \"machine-config-server-f6l2s\" (UID: \"fe62c67c-9803-4d61-a882-9406db825f8a\") " pod="openshift-machine-config-operator/machine-config-server-f6l2s" Oct 01 12:55:34 crc kubenswrapper[4851]: E1001 12:55:34.162328 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:34.662299946 +0000 UTC m=+143.007417512 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162367 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8de9ba8e-0c67-49d3-becd-c845dbba7335-trusted-ca\") pod \"ingress-operator-5b745b69d9-72dqh\" (UID: \"8de9ba8e-0c67-49d3-becd-c845dbba7335\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-72dqh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162416 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a54d2004-e4e0-4b2c-b026-e82a6f241da7-registry-tls\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162435 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d1f1318-7c6a-42ea-a987-c419603e81bf-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fclxw\" (UID: \"9d1f1318-7c6a-42ea-a987-c419603e81bf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fclxw" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162464 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mss5\" (UniqueName: 
\"kubernetes.io/projected/c0266e2f-59eb-4610-beef-77397d4c297f-kube-api-access-6mss5\") pod \"authentication-operator-69f744f599-vxwl6\" (UID: \"c0266e2f-59eb-4610-beef-77397d4c297f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vxwl6" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162487 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e723ab1e-2b65-4236-91f4-1dbae3e4acf7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7hnt8\" (UID: \"e723ab1e-2b65-4236-91f4-1dbae3e4acf7\") " pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162550 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a54d2004-e4e0-4b2c-b026-e82a6f241da7-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162571 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1eb7909a-cc95-480b-bd8d-e1f6092f3baa-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7wzb9\" (UID: \"1eb7909a-cc95-480b-bd8d-e1f6092f3baa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7wzb9" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162590 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0266e2f-59eb-4610-beef-77397d4c297f-service-ca-bundle\") pod \"authentication-operator-69f744f599-vxwl6\" (UID: \"c0266e2f-59eb-4610-beef-77397d4c297f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vxwl6" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162628 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d17b1f73-c9dd-494a-a996-d4bc6ee4aec7-socket-dir\") pod \"csi-hostpathplugin-gjvsn\" (UID: \"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7\") " pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162665 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d17b1f73-c9dd-494a-a996-d4bc6ee4aec7-plugins-dir\") pod \"csi-hostpathplugin-gjvsn\" (UID: \"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7\") " pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162691 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh9hv\" (UniqueName: \"kubernetes.io/projected/dbd5b9f8-88ba-4b49-86b0-ce9de90aa30d-kube-api-access-vh9hv\") pod \"cluster-samples-operator-665b6dd947-tbg25\" (UID: \"dbd5b9f8-88ba-4b49-86b0-ce9de90aa30d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tbg25" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162709 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68776259-af0d-442c-b3e5-5e242a7b8a9f-config\") pod \"machine-approver-56656f9798-b7tcr\" (UID: 
\"68776259-af0d-442c-b3e5-5e242a7b8a9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b7tcr" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162729 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sxw6\" (UniqueName: \"kubernetes.io/projected/a5b1d979-1dee-401d-89e9-92314369ca7d-kube-api-access-2sxw6\") pod \"catalog-operator-68c6474976-28f4n\" (UID: \"a5b1d979-1dee-401d-89e9-92314369ca7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-28f4n" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162773 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00931d70-10d2-42f3-83ff-6a25098df202-config-volume\") pod \"dns-default-t26f4\" (UID: \"00931d70-10d2-42f3-83ff-6a25098df202\") " pod="openshift-dns/dns-default-t26f4" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.162793 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87722bcd-9eeb-4b6b-ba53-b6283053a06f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mffkh\" (UID: \"87722bcd-9eeb-4b6b-ba53-b6283053a06f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mffkh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163110 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4892ebfa-87d9-4960-9c75-59dbdacfc690-cert\") pod \"ingress-canary-ffhmh\" (UID: \"4892ebfa-87d9-4960-9c75-59dbdacfc690\") " pod="openshift-ingress-canary/ingress-canary-ffhmh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163131 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-client-ca\") pod \"route-controller-manager-6576b87f9c-rgkzd\" (UID: \"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163146 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91e48130-a450-4f7e-aa07-bfcaa13aae66-proxy-tls\") pod \"machine-config-controller-84d6567774-fv9xq\" (UID: \"91e48130-a450-4f7e-aa07-bfcaa13aae66\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fv9xq" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163190 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3ed2545b-85c0-4141-819a-18c221911af9-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-b76sz\" (UID: \"3ed2545b-85c0-4141-819a-18c221911af9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-b76sz" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163216 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/aef587e8-7f95-42cd-9a4f-10b1024abef0-images\") pod \"machine-config-operator-74547568cd-nk8rb\" (UID: \"aef587e8-7f95-42cd-9a4f-10b1024abef0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nk8rb" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163246 4851 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q5k4\" (UniqueName: \"kubernetes.io/projected/d17b1f73-c9dd-494a-a996-d4bc6ee4aec7-kube-api-access-8q5k4\") pod \"csi-hostpathplugin-gjvsn\" (UID: \"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7\") " pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163272 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0266e2f-59eb-4610-beef-77397d4c297f-serving-cert\") pod \"authentication-operator-69f744f599-vxwl6\" (UID: \"c0266e2f-59eb-4610-beef-77397d4c297f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vxwl6" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163307 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a54d2004-e4e0-4b2c-b026-e82a6f241da7-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163348 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp5kp\" (UniqueName: \"kubernetes.io/projected/8de9ba8e-0c67-49d3-becd-c845dbba7335-kube-api-access-qp5kp\") pod \"ingress-operator-5b745b69d9-72dqh\" (UID: \"8de9ba8e-0c67-49d3-becd-c845dbba7335\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-72dqh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163375 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prg7d\" (UniqueName: \"kubernetes.io/projected/9d1f1318-7c6a-42ea-a987-c419603e81bf-kube-api-access-prg7d\") pod \"kube-storage-version-migrator-operator-b67b599dd-fclxw\" (UID: \"9d1f1318-7c6a-42ea-a987-c419603e81bf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fclxw" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163416 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e00479b8-ba27-494b-907d-a4a201ae4be1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-jqn7q\" (UID: \"e00479b8-ba27-494b-907d-a4a201ae4be1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jqn7q" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163477 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/68776259-af0d-442c-b3e5-5e242a7b8a9f-machine-approver-tls\") pod \"machine-approver-56656f9798-b7tcr\" (UID: \"68776259-af0d-442c-b3e5-5e242a7b8a9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b7tcr" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163525 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-config\") pod \"route-controller-manager-6576b87f9c-rgkzd\" (UID: \"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163550 4851 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8e7ea87-4895-4f48-b774-e337103d3951-serving-cert\") pod \"openshift-config-operator-7777fb866f-x5dht\" (UID: \"d8e7ea87-4895-4f48-b774-e337103d3951\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x5dht" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163583 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9g46\" (UniqueName: \"kubernetes.io/projected/928cc097-331d-49d7-8a9d-957e2032f941-kube-api-access-d9g46\") pod \"downloads-7954f5f757-qcsbq\" (UID: \"928cc097-331d-49d7-8a9d-957e2032f941\") " pod="openshift-console/downloads-7954f5f757-qcsbq" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163649 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00931d70-10d2-42f3-83ff-6a25098df202-metrics-tls\") pod \"dns-default-t26f4\" (UID: \"00931d70-10d2-42f3-83ff-6a25098df202\") " pod="openshift-dns/dns-default-t26f4" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163677 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs8cj\" (UniqueName: \"kubernetes.io/projected/a54d2004-e4e0-4b2c-b026-e82a6f241da7-kube-api-access-fs8cj\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163708 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-serving-cert\") pod \"route-controller-manager-6576b87f9c-rgkzd\" (UID: \"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163733 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqr49\" (UniqueName: \"kubernetes.io/projected/00931d70-10d2-42f3-83ff-6a25098df202-kube-api-access-dqr49\") pod \"dns-default-t26f4\" (UID: \"00931d70-10d2-42f3-83ff-6a25098df202\") " pod="openshift-dns/dns-default-t26f4" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163773 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1eb7909a-cc95-480b-bd8d-e1f6092f3baa-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7wzb9\" (UID: \"1eb7909a-cc95-480b-bd8d-e1f6092f3baa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7wzb9" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163813 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d17b1f73-c9dd-494a-a996-d4bc6ee4aec7-mountpoint-dir\") pod \"csi-hostpathplugin-gjvsn\" (UID: \"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7\") " pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163842 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/68c1357c-a93f-4ecc-9743-2bdf982607c2-metrics-tls\") pod \"dns-operator-744455d44c-6ccbw\" (UID: \"68c1357c-a93f-4ecc-9743-2bdf982607c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-6ccbw" Oct 01 
12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163894 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d1f1318-7c6a-42ea-a987-c419603e81bf-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fclxw\" (UID: \"9d1f1318-7c6a-42ea-a987-c419603e81bf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fclxw" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163926 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163953 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a54d2004-e4e0-4b2c-b026-e82a6f241da7-trusted-ca\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163978 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdflk\" (UniqueName: \"kubernetes.io/projected/4892ebfa-87d9-4960-9c75-59dbdacfc690-kube-api-access-tdflk\") pod \"ingress-canary-ffhmh\" (UID: \"4892ebfa-87d9-4960-9c75-59dbdacfc690\") " pod="openshift-ingress-canary/ingress-canary-ffhmh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.164007 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s2qb\" (UniqueName: \"kubernetes.io/projected/45efbf96-913d-4b37-87dc-0ff84f96257a-kube-api-access-8s2qb\") pod \"migrator-59844c95c7-qzxpg\" (UID: \"45efbf96-913d-4b37-87dc-0ff84f96257a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qzxpg" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.164034 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hppgp\" (UniqueName: \"kubernetes.io/projected/a93ae38a-3f2c-42bf-9da9-d30f439a371f-kube-api-access-hppgp\") pod \"olm-operator-6b444d44fb-dq7rl\" (UID: \"a93ae38a-3f2c-42bf-9da9-d30f439a371f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dq7rl" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.164057 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wnjr\" (UniqueName: \"kubernetes.io/projected/87722bcd-9eeb-4b6b-ba53-b6283053a06f-kube-api-access-2wnjr\") pod \"openshift-apiserver-operator-796bbdcf4f-mffkh\" (UID: \"87722bcd-9eeb-4b6b-ba53-b6283053a06f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mffkh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.164081 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e00479b8-ba27-494b-907d-a4a201ae4be1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-jqn7q\" (UID: \"e00479b8-ba27-494b-907d-a4a201ae4be1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jqn7q" Oct 01 12:55:34 crc 
kubenswrapper[4851]: I1001 12:55:34.164120 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a93ae38a-3f2c-42bf-9da9-d30f439a371f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dq7rl\" (UID: \"a93ae38a-3f2c-42bf-9da9-d30f439a371f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dq7rl" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.163650 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e00479b8-ba27-494b-907d-a4a201ae4be1-config\") pod \"kube-controller-manager-operator-78b949d7b-jqn7q\" (UID: \"e00479b8-ba27-494b-907d-a4a201ae4be1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jqn7q" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.164147 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhgmn\" (UniqueName: \"kubernetes.io/projected/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-kube-api-access-zhgmn\") pod \"route-controller-manager-6576b87f9c-rgkzd\" (UID: \"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.164224 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1eb7909a-cc95-480b-bd8d-e1f6092f3baa-config\") pod \"kube-apiserver-operator-766d6c64bb-7wzb9\" (UID: \"1eb7909a-cc95-480b-bd8d-e1f6092f3baa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7wzb9" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.164257 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d17b1f73-c9dd-494a-a996-d4bc6ee4aec7-registration-dir\") pod \"csi-hostpathplugin-gjvsn\" (UID: \"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7\") " pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.164283 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dbd5b9f8-88ba-4b49-86b0-ce9de90aa30d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tbg25\" (UID: \"dbd5b9f8-88ba-4b49-86b0-ce9de90aa30d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tbg25" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.164325 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlvh9\" (UniqueName: \"kubernetes.io/projected/91e48130-a450-4f7e-aa07-bfcaa13aae66-kube-api-access-xlvh9\") pod \"machine-config-controller-84d6567774-fv9xq\" (UID: \"91e48130-a450-4f7e-aa07-bfcaa13aae66\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fv9xq" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.164345 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a5b1d979-1dee-401d-89e9-92314369ca7d-profile-collector-cert\") pod \"catalog-operator-68c6474976-28f4n\" (UID: \"a5b1d979-1dee-401d-89e9-92314369ca7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-28f4n" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.164373 4851 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a54d2004-e4e0-4b2c-b026-e82a6f241da7-registry-certificates\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.164398 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8de9ba8e-0c67-49d3-becd-c845dbba7335-bound-sa-token\") pod \"ingress-operator-5b745b69d9-72dqh\" (UID: \"8de9ba8e-0c67-49d3-becd-c845dbba7335\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-72dqh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.165215 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0266e2f-59eb-4610-beef-77397d4c297f-service-ca-bundle\") pod \"authentication-operator-69f744f599-vxwl6\" (UID: \"c0266e2f-59eb-4610-beef-77397d4c297f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vxwl6" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.165371 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8de9ba8e-0c67-49d3-becd-c845dbba7335-trusted-ca\") pod \"ingress-operator-5b745b69d9-72dqh\" (UID: \"8de9ba8e-0c67-49d3-becd-c845dbba7335\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-72dqh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.165890 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1eb7909a-cc95-480b-bd8d-e1f6092f3baa-config\") pod \"kube-apiserver-operator-766d6c64bb-7wzb9\" (UID: \"1eb7909a-cc95-480b-bd8d-e1f6092f3baa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7wzb9" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.166832 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/aef587e8-7f95-42cd-9a4f-10b1024abef0-images\") pod \"machine-config-operator-74547568cd-nk8rb\" (UID: \"aef587e8-7f95-42cd-9a4f-10b1024abef0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nk8rb" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.171740 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aef587e8-7f95-42cd-9a4f-10b1024abef0-proxy-tls\") pod \"machine-config-operator-74547568cd-nk8rb\" (UID: \"aef587e8-7f95-42cd-9a4f-10b1024abef0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nk8rb" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.173712 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8de9ba8e-0c67-49d3-becd-c845dbba7335-metrics-tls\") pod \"ingress-operator-5b745b69d9-72dqh\" (UID: \"8de9ba8e-0c67-49d3-becd-c845dbba7335\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-72dqh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.173891 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e723ab1e-2b65-4236-91f4-1dbae3e4acf7-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-7hnt8\" (UID: \"e723ab1e-2b65-4236-91f4-1dbae3e4acf7\") " pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.175711 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a54d2004-e4e0-4b2c-b026-e82a6f241da7-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.176080 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a54d2004-e4e0-4b2c-b026-e82a6f241da7-registry-certificates\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.176787 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d8e7ea87-4895-4f48-b774-e337103d3951-available-featuregates\") pod \"openshift-config-operator-7777fb866f-x5dht\" (UID: \"d8e7ea87-4895-4f48-b774-e337103d3951\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x5dht" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.177784 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/aef587e8-7f95-42cd-9a4f-10b1024abef0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-nk8rb\" (UID: \"aef587e8-7f95-42cd-9a4f-10b1024abef0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nk8rb" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.179673 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87722bcd-9eeb-4b6b-ba53-b6283053a06f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mffkh\" (UID: \"87722bcd-9eeb-4b6b-ba53-b6283053a06f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mffkh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.180104 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0266e2f-59eb-4610-beef-77397d4c297f-config\") pod \"authentication-operator-69f744f599-vxwl6\" (UID: \"c0266e2f-59eb-4610-beef-77397d4c297f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vxwl6" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.180912 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/91e48130-a450-4f7e-aa07-bfcaa13aae66-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-fv9xq\" (UID: \"91e48130-a450-4f7e-aa07-bfcaa13aae66\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fv9xq" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.183957 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dbd5b9f8-88ba-4b49-86b0-ce9de90aa30d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tbg25\" (UID: \"dbd5b9f8-88ba-4b49-86b0-ce9de90aa30d\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tbg25" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.184462 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a5b1d979-1dee-401d-89e9-92314369ca7d-profile-collector-cert\") pod \"catalog-operator-68c6474976-28f4n\" (UID: \"a5b1d979-1dee-401d-89e9-92314369ca7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-28f4n" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.187031 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a54d2004-e4e0-4b2c-b026-e82a6f241da7-registry-tls\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.187187 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68776259-af0d-442c-b3e5-5e242a7b8a9f-config\") pod \"machine-approver-56656f9798-b7tcr\" (UID: \"68776259-af0d-442c-b3e5-5e242a7b8a9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b7tcr" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.190378 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/68776259-af0d-442c-b3e5-5e242a7b8a9f-auth-proxy-config\") pod \"machine-approver-56656f9798-b7tcr\" (UID: \"68776259-af0d-442c-b3e5-5e242a7b8a9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b7tcr" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.190535 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0266e2f-59eb-4610-beef-77397d4c297f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vxwl6\" (UID: \"c0266e2f-59eb-4610-beef-77397d4c297f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vxwl6" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.190706 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3ed2545b-85c0-4141-819a-18c221911af9-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-b76sz\" (UID: \"3ed2545b-85c0-4141-819a-18c221911af9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-b76sz" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.190760 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91e48130-a450-4f7e-aa07-bfcaa13aae66-proxy-tls\") pod \"machine-config-controller-84d6567774-fv9xq\" (UID: \"91e48130-a450-4f7e-aa07-bfcaa13aae66\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fv9xq" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.192244 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a93ae38a-3f2c-42bf-9da9-d30f439a371f-srv-cert\") pod \"olm-operator-6b444d44fb-dq7rl\" (UID: \"a93ae38a-3f2c-42bf-9da9-d30f439a371f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dq7rl" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.192713 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bhp8c" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.193038 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1eb7909a-cc95-480b-bd8d-e1f6092f3baa-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7wzb9\" (UID: \"1eb7909a-cc95-480b-bd8d-e1f6092f3baa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7wzb9" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.193204 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d1f1318-7c6a-42ea-a987-c419603e81bf-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fclxw\" (UID: \"9d1f1318-7c6a-42ea-a987-c419603e81bf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fclxw" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.193775 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a93ae38a-3f2c-42bf-9da9-d30f439a371f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dq7rl\" (UID: \"a93ae38a-3f2c-42bf-9da9-d30f439a371f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dq7rl" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.193831 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00931d70-10d2-42f3-83ff-6a25098df202-metrics-tls\") pod \"dns-default-t26f4\" (UID: \"00931d70-10d2-42f3-83ff-6a25098df202\") " pod="openshift-dns/dns-default-t26f4" Oct 01 12:55:34 crc kubenswrapper[4851]: E1001 12:55:34.194233 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:34.694215957 +0000 UTC m=+143.039333443 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.195964 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-config\") pod \"route-controller-manager-6576b87f9c-rgkzd\" (UID: \"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.196457 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00931d70-10d2-42f3-83ff-6a25098df202-config-volume\") pod \"dns-default-t26f4\" (UID: \"00931d70-10d2-42f3-83ff-6a25098df202\") " pod="openshift-dns/dns-default-t26f4" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.198110 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/68c1357c-a93f-4ecc-9743-2bdf982607c2-metrics-tls\") pod \"dns-operator-744455d44c-6ccbw\" (UID: \"68c1357c-a93f-4ecc-9743-2bdf982607c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-6ccbw" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.203068 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8e7ea87-4895-4f48-b774-e337103d3951-serving-cert\") pod \"openshift-config-operator-7777fb866f-x5dht\" (UID: \"d8e7ea87-4895-4f48-b774-e337103d3951\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x5dht" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.203351 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a5b1d979-1dee-401d-89e9-92314369ca7d-srv-cert\") pod \"catalog-operator-68c6474976-28f4n\" (UID: \"a5b1d979-1dee-401d-89e9-92314369ca7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-28f4n" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.206964 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d1f1318-7c6a-42ea-a987-c419603e81bf-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fclxw\" (UID: \"9d1f1318-7c6a-42ea-a987-c419603e81bf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fclxw" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.207331 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e723ab1e-2b65-4236-91f4-1dbae3e4acf7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7hnt8\" (UID: \"e723ab1e-2b65-4236-91f4-1dbae3e4acf7\") " pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.207855 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e00479b8-ba27-494b-907d-a4a201ae4be1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-jqn7q\" (UID: \"e00479b8-ba27-494b-907d-a4a201ae4be1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jqn7q" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.216734 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/68776259-af0d-442c-b3e5-5e242a7b8a9f-machine-approver-tls\") pod \"machine-approver-56656f9798-b7tcr\" (UID: \"68776259-af0d-442c-b3e5-5e242a7b8a9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b7tcr" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.217143 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87722bcd-9eeb-4b6b-ba53-b6283053a06f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mffkh\" (UID: \"87722bcd-9eeb-4b6b-ba53-b6283053a06f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mffkh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.218020 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0266e2f-59eb-4610-beef-77397d4c297f-serving-cert\") pod \"authentication-operator-69f744f599-vxwl6\" (UID: \"c0266e2f-59eb-4610-beef-77397d4c297f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vxwl6" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.219220 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmhlh\" (UniqueName: \"kubernetes.io/projected/3d8acc9e-4d2b-4118-9e90-d95ce539747a-kube-api-access-jmhlh\") pod \"package-server-manager-789f6589d5-2kfkj\" (UID: \"3d8acc9e-4d2b-4118-9e90-d95ce539747a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2kfkj" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.219557 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a54d2004-e4e0-4b2c-b026-e82a6f241da7-trusted-ca\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.219747 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d8acc9e-4d2b-4118-9e90-d95ce539747a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2kfkj\" (UID: \"3d8acc9e-4d2b-4118-9e90-d95ce539747a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2kfkj" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.226779 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-serving-cert\") pod \"route-controller-manager-6576b87f9c-rgkzd\" (UID: \"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.231000 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a54d2004-e4e0-4b2c-b026-e82a6f241da7-installation-pull-secrets\") 
pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.242353 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" event={"ID":"c557af36-f061-43db-9a5b-e88e8df92493","Type":"ContainerStarted","Data":"97be737a00a1122f21fde380ad762b2f0cd569b4d9265de3cb86f474d407e805"} Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.252698 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ldmk\" (UniqueName: \"kubernetes.io/projected/aef587e8-7f95-42cd-9a4f-10b1024abef0-kube-api-access-5ldmk\") pod \"machine-config-operator-74547568cd-nk8rb\" (UID: \"aef587e8-7f95-42cd-9a4f-10b1024abef0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nk8rb" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.256590 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6p6zn"] Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.259605 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2qlz9"] Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.260483 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhgmn\" (UniqueName: \"kubernetes.io/projected/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-kube-api-access-zhgmn\") pod \"route-controller-manager-6576b87f9c-rgkzd\" (UID: \"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.261203 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a54d2004-e4e0-4b2c-b026-e82a6f241da7-bound-sa-token\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.264052 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" event={"ID":"86146cb0-0a54-4309-8345-565e2da39442","Type":"ContainerStarted","Data":"f4005abaf17d20fb4518a3ea738bcc70e639fbbe57dcd3f12f2e659b106c2073"} Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.268088 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:34 crc kubenswrapper[4851]: E1001 12:55:34.268267 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:34.768253789 +0000 UTC m=+143.113371275 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.268573 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-qtlzx" event={"ID":"24e14587-7a28-41eb-9184-668739c10654","Type":"ContainerStarted","Data":"cb2150c01c090ce1f08db15df3827572723cff0289bf2f42984fd5b3edc07fa9"} Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.268612 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-qtlzx" event={"ID":"24e14587-7a28-41eb-9184-668739c10654","Type":"ContainerStarted","Data":"4fa0b75f70240103c6a1ca19f2877c6a557ee62631117340c44d5258ca11fd67"} Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.269209 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d17b1f73-c9dd-494a-a996-d4bc6ee4aec7-socket-dir\") pod \"csi-hostpathplugin-gjvsn\" (UID: \"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7\") " pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.269791 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d17b1f73-c9dd-494a-a996-d4bc6ee4aec7-plugins-dir\") pod \"csi-hostpathplugin-gjvsn\" (UID: \"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7\") " pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.269627 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d17b1f73-c9dd-494a-a996-d4bc6ee4aec7-socket-dir\") pod \"csi-hostpathplugin-gjvsn\" (UID: \"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7\") " pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.269884 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4892ebfa-87d9-4960-9c75-59dbdacfc690-cert\") pod \"ingress-canary-ffhmh\" (UID: \"4892ebfa-87d9-4960-9c75-59dbdacfc690\") " pod="openshift-ingress-canary/ingress-canary-ffhmh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.269940 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d17b1f73-c9dd-494a-a996-d4bc6ee4aec7-plugins-dir\") pod \"csi-hostpathplugin-gjvsn\" (UID: \"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7\") " pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.272897 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-68zzg" event={"ID":"0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697","Type":"ContainerStarted","Data":"961bae1a3f006d9939a8fb034aa8548d0e0257ab11c3644c60d189a4af1d9dca"} Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.273509 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q5k4\" (UniqueName: 
\"kubernetes.io/projected/d17b1f73-c9dd-494a-a996-d4bc6ee4aec7-kube-api-access-8q5k4\") pod \"csi-hostpathplugin-gjvsn\" (UID: \"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7\") " pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.273596 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d17b1f73-c9dd-494a-a996-d4bc6ee4aec7-mountpoint-dir\") pod \"csi-hostpathplugin-gjvsn\" (UID: \"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7\") " pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.273618 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.273644 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdflk\" (UniqueName: \"kubernetes.io/projected/4892ebfa-87d9-4960-9c75-59dbdacfc690-kube-api-access-tdflk\") pod \"ingress-canary-ffhmh\" (UID: \"4892ebfa-87d9-4960-9c75-59dbdacfc690\") " pod="openshift-ingress-canary/ingress-canary-ffhmh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.273692 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d17b1f73-c9dd-494a-a996-d4bc6ee4aec7-registration-dir\") pod \"csi-hostpathplugin-gjvsn\" (UID: \"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7\") " pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.273723 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fe62c67c-9803-4d61-a882-9406db825f8a-node-bootstrap-token\") pod \"machine-config-server-f6l2s\" (UID: \"fe62c67c-9803-4d61-a882-9406db825f8a\") " pod="openshift-machine-config-operator/machine-config-server-f6l2s" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.273741 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d17b1f73-c9dd-494a-a996-d4bc6ee4aec7-csi-data-dir\") pod \"csi-hostpathplugin-gjvsn\" (UID: \"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7\") " pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.273803 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fe62c67c-9803-4d61-a882-9406db825f8a-certs\") pod \"machine-config-server-f6l2s\" (UID: \"fe62c67c-9803-4d61-a882-9406db825f8a\") " pod="openshift-machine-config-operator/machine-config-server-f6l2s" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.273829 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8rxz\" (UniqueName: \"kubernetes.io/projected/fe62c67c-9803-4d61-a882-9406db825f8a-kube-api-access-v8rxz\") pod \"machine-config-server-f6l2s\" (UID: \"fe62c67c-9803-4d61-a882-9406db825f8a\") " pod="openshift-machine-config-operator/machine-config-server-f6l2s" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.273954 
4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d17b1f73-c9dd-494a-a996-d4bc6ee4aec7-mountpoint-dir\") pod \"csi-hostpathplugin-gjvsn\" (UID: \"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7\") " pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" Oct 01 12:55:34 crc kubenswrapper[4851]: E1001 12:55:34.274177 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:34.774166346 +0000 UTC m=+143.119283832 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.274534 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d17b1f73-c9dd-494a-a996-d4bc6ee4aec7-registration-dir\") pod \"csi-hostpathplugin-gjvsn\" (UID: \"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7\") " pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.275344 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d17b1f73-c9dd-494a-a996-d4bc6ee4aec7-csi-data-dir\") pod \"csi-hostpathplugin-gjvsn\" (UID: \"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7\") " pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.276669 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" event={"ID":"327992e6-37f8-487b-915b-b633070aece3","Type":"ContainerStarted","Data":"f2d2208a673ae912cbbc0776ff62fa127fa1a89e064cf26b6b1aea8cbf744b8e"} Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.277718 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" event={"ID":"f6538b69-30fc-4cc0-80da-62537b61f41f","Type":"ContainerStarted","Data":"86dc48ad22416747fe5700b8f1e98828d980b5468998601d85058293390039cb"} Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.279431 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fe62c67c-9803-4d61-a882-9406db825f8a-certs\") pod \"machine-config-server-f6l2s\" (UID: \"fe62c67c-9803-4d61-a882-9406db825f8a\") " pod="openshift-machine-config-operator/machine-config-server-f6l2s" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.280006 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4892ebfa-87d9-4960-9c75-59dbdacfc690-cert\") pod \"ingress-canary-ffhmh\" (UID: \"4892ebfa-87d9-4960-9c75-59dbdacfc690\") " pod="openshift-ingress-canary/ingress-canary-ffhmh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.286747 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/fe62c67c-9803-4d61-a882-9406db825f8a-node-bootstrap-token\") pod \"machine-config-server-f6l2s\" (UID: \"fe62c67c-9803-4d61-a882-9406db825f8a\") " pod="openshift-machine-config-operator/machine-config-server-f6l2s" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.288059 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nk8rb" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.293577 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs8cj\" (UniqueName: \"kubernetes.io/projected/a54d2004-e4e0-4b2c-b026-e82a6f241da7-kube-api-access-fs8cj\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.299104 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltnk8\" (UniqueName: \"kubernetes.io/projected/68776259-af0d-442c-b3e5-5e242a7b8a9f-kube-api-access-ltnk8\") pod \"machine-approver-56656f9798-b7tcr\" (UID: \"68776259-af0d-442c-b3e5-5e242a7b8a9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b7tcr" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.300846 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2kfkj" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.311646 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-xjvqh"] Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.328267 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mss5\" (UniqueName: \"kubernetes.io/projected/c0266e2f-59eb-4610-beef-77397d4c297f-kube-api-access-6mss5\") pod \"authentication-operator-69f744f599-vxwl6\" (UID: \"c0266e2f-59eb-4610-beef-77397d4c297f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vxwl6" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.343455 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlvh9\" (UniqueName: \"kubernetes.io/projected/91e48130-a450-4f7e-aa07-bfcaa13aae66-kube-api-access-xlvh9\") pod \"machine-config-controller-84d6567774-fv9xq\" (UID: \"91e48130-a450-4f7e-aa07-bfcaa13aae66\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fv9xq" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.369013 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqr49\" (UniqueName: \"kubernetes.io/projected/00931d70-10d2-42f3-83ff-6a25098df202-kube-api-access-dqr49\") pod \"dns-default-t26f4\" (UID: \"00931d70-10d2-42f3-83ff-6a25098df202\") " pod="openshift-dns/dns-default-t26f4" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.374492 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-t26f4" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.375079 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:34 crc kubenswrapper[4851]: E1001 12:55:34.375297 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:34.875255961 +0000 UTC m=+143.220373447 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.375759 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: E1001 12:55:34.380077 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:34.880064007 +0000 UTC m=+143.225181493 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.403708 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1eb7909a-cc95-480b-bd8d-e1f6092f3baa-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7wzb9\" (UID: \"1eb7909a-cc95-480b-bd8d-e1f6092f3baa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7wzb9" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.423400 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh9hv\" (UniqueName: \"kubernetes.io/projected/dbd5b9f8-88ba-4b49-86b0-ce9de90aa30d-kube-api-access-vh9hv\") pod \"cluster-samples-operator-665b6dd947-tbg25\" (UID: \"dbd5b9f8-88ba-4b49-86b0-ce9de90aa30d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tbg25" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.442233 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8de9ba8e-0c67-49d3-becd-c845dbba7335-bound-sa-token\") pod \"ingress-operator-5b745b69d9-72dqh\" (UID: \"8de9ba8e-0c67-49d3-becd-c845dbba7335\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-72dqh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.463024 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-vxwl6" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.463329 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4h96\" (UniqueName: \"kubernetes.io/projected/d8e7ea87-4895-4f48-b774-e337103d3951-kube-api-access-x4h96\") pod \"openshift-config-operator-7777fb866f-x5dht\" (UID: \"d8e7ea87-4895-4f48-b774-e337103d3951\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x5dht" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.467228 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bhp8c"] Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.469448 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b7tcr" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.476650 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.477060 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s2qb\" (UniqueName: \"kubernetes.io/projected/45efbf96-913d-4b37-87dc-0ff84f96257a-kube-api-access-8s2qb\") pod \"migrator-59844c95c7-qzxpg\" (UID: \"45efbf96-913d-4b37-87dc-0ff84f96257a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qzxpg" Oct 01 12:55:34 crc kubenswrapper[4851]: E1001 12:55:34.477099 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:34.977081128 +0000 UTC m=+143.322198614 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.482286 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.496973 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w"] Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.500292 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mvb7b"] Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.502179 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hppgp\" (UniqueName: \"kubernetes.io/projected/a93ae38a-3f2c-42bf-9da9-d30f439a371f-kube-api-access-hppgp\") pod \"olm-operator-6b444d44fb-dq7rl\" (UID: \"a93ae38a-3f2c-42bf-9da9-d30f439a371f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dq7rl" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.503950 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c9zt4"] Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.513141 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7wzb9" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.525795 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wnjr\" (UniqueName: \"kubernetes.io/projected/87722bcd-9eeb-4b6b-ba53-b6283053a06f-kube-api-access-2wnjr\") pod \"openshift-apiserver-operator-796bbdcf4f-mffkh\" (UID: \"87722bcd-9eeb-4b6b-ba53-b6283053a06f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mffkh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.527132 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mffkh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.545108 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e00479b8-ba27-494b-907d-a4a201ae4be1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-jqn7q\" (UID: \"e00479b8-ba27-494b-907d-a4a201ae4be1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jqn7q" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.547530 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jqn7q" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.550120 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8dv98"] Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.559791 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88g9p\" (UniqueName: \"kubernetes.io/projected/e723ab1e-2b65-4236-91f4-1dbae3e4acf7-kube-api-access-88g9p\") pod \"marketplace-operator-79b997595-7hnt8\" (UID: \"e723ab1e-2b65-4236-91f4-1dbae3e4acf7\") " pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.562239 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-wmgkn"] Oct 01 12:55:34 crc kubenswrapper[4851]: W1001 12:55:34.572474 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7ff570b_9b52_4d2f_b030_40eca1804794.slice/crio-b04785f063db4d371bad97c4608f5d77c5448f892d141a0115860e97db44cc8b WatchSource:0}: Error finding container b04785f063db4d371bad97c4608f5d77c5448f892d141a0115860e97db44cc8b: Status 404 returned error can't find the container with id b04785f063db4d371bad97c4608f5d77c5448f892d141a0115860e97db44cc8b Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.582183 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: E1001 12:55:34.582770 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-01 12:55:35.082759003 +0000 UTC m=+143.427876489 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.584121 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp5kp\" (UniqueName: \"kubernetes.io/projected/8de9ba8e-0c67-49d3-becd-c845dbba7335-kube-api-access-qp5kp\") pod \"ingress-operator-5b745b69d9-72dqh\" (UID: \"8de9ba8e-0c67-49d3-becd-c845dbba7335\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-72dqh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.589289 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-lmpr6"] Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.590192 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct"] Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.608558 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fv9xq" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.609380 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prg7d\" (UniqueName: \"kubernetes.io/projected/9d1f1318-7c6a-42ea-a987-c419603e81bf-kube-api-access-prg7d\") pod \"kube-storage-version-migrator-operator-b67b599dd-fclxw\" (UID: \"9d1f1318-7c6a-42ea-a987-c419603e81bf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fclxw" Oct 01 12:55:34 crc kubenswrapper[4851]: W1001 12:55:34.615393 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd07660b5_fafb_43c6_89a0_eb5a359eb314.slice/crio-17f9845479ed39145b723eea6b329558f2d9eb0e4f0bbb46efaca1077b3e9678 WatchSource:0}: Error finding container 17f9845479ed39145b723eea6b329558f2d9eb0e4f0bbb46efaca1077b3e9678: Status 404 returned error can't find the container with id 17f9845479ed39145b723eea6b329558f2d9eb0e4f0bbb46efaca1077b3e9678 Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.618004 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnvbd\" (UniqueName: \"kubernetes.io/projected/68c1357c-a93f-4ecc-9743-2bdf982607c2-kube-api-access-rnvbd\") pod \"dns-operator-744455d44c-6ccbw\" (UID: \"68c1357c-a93f-4ecc-9743-2bdf982607c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-6ccbw" Oct 01 12:55:34 crc kubenswrapper[4851]: W1001 12:55:34.625654 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8002697d_726f_49e7_830c_c64083912932.slice/crio-68036fcc9f90e00d01214894d62f049cd5e035237830f1abf7c37dddea78b997 WatchSource:0}: Error finding container 68036fcc9f90e00d01214894d62f049cd5e035237830f1abf7c37dddea78b997: Status 404 returned error can't find the container with id 
68036fcc9f90e00d01214894d62f049cd5e035237830f1abf7c37dddea78b997 Oct 01 12:55:34 crc kubenswrapper[4851]: W1001 12:55:34.633667 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf541a9af_1ddc_4f57_b6e9_2b1c5baa9953.slice/crio-f1c1c513a74857e68b5274ad89b031fdf3dfb575d07be3f1d4e84e5c44758efa WatchSource:0}: Error finding container f1c1c513a74857e68b5274ad89b031fdf3dfb575d07be3f1d4e84e5c44758efa: Status 404 returned error can't find the container with id f1c1c513a74857e68b5274ad89b031fdf3dfb575d07be3f1d4e84e5c44758efa Oct 01 12:55:34 crc kubenswrapper[4851]: W1001 12:55:34.651162 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3d024a0_38bb_4a46_8b02_3f380b51d144.slice/crio-f37b5311c375c63d0a60528fd333e78cdd8eca12bf3a52bdce803c250e6e9005 WatchSource:0}: Error finding container f37b5311c375c63d0a60528fd333e78cdd8eca12bf3a52bdce803c250e6e9005: Status 404 returned error can't find the container with id f37b5311c375c63d0a60528fd333e78cdd8eca12bf3a52bdce803c250e6e9005 Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.652769 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prtfr\" (UniqueName: \"kubernetes.io/projected/3ed2545b-85c0-4141-819a-18c221911af9-kube-api-access-prtfr\") pod \"multus-admission-controller-857f4d67dd-b76sz\" (UID: \"3ed2545b-85c0-4141-819a-18c221911af9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-b76sz" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.653251 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-nk8rb"] Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.653567 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.664070 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tbg25" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.665989 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9g46\" (UniqueName: \"kubernetes.io/projected/928cc097-331d-49d7-8a9d-957e2032f941-kube-api-access-d9g46\") pod \"downloads-7954f5f757-qcsbq\" (UID: \"928cc097-331d-49d7-8a9d-957e2032f941\") " pod="openshift-console/downloads-7954f5f757-qcsbq" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.667650 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dq7rl" Oct 01 12:55:34 crc kubenswrapper[4851]: W1001 12:55:34.670680 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode967d0be_0502_4857_9be0_1a6560362c34.slice/crio-4d79c3f2b428e5d98c2d7f994b61b4a69caa5e21a7b97583749d127a027e481b WatchSource:0}: Error finding container 4d79c3f2b428e5d98c2d7f994b61b4a69caa5e21a7b97583749d127a027e481b: Status 404 returned error can't find the container with id 4d79c3f2b428e5d98c2d7f994b61b4a69caa5e21a7b97583749d127a027e481b Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.683767 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:34 crc kubenswrapper[4851]: E1001 12:55:34.683899 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:35.183873869 +0000 UTC m=+143.528991355 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.684111 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: E1001 12:55:34.684460 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:35.184448146 +0000 UTC m=+143.529565632 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.687265 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sxw6\" (UniqueName: \"kubernetes.io/projected/a5b1d979-1dee-401d-89e9-92314369ca7d-kube-api-access-2sxw6\") pod \"catalog-operator-68c6474976-28f4n\" (UID: \"a5b1d979-1dee-401d-89e9-92314369ca7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-28f4n" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.701788 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x5dht" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.702901 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q5k4\" (UniqueName: \"kubernetes.io/projected/d17b1f73-c9dd-494a-a996-d4bc6ee4aec7-kube-api-access-8q5k4\") pod \"csi-hostpathplugin-gjvsn\" (UID: \"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7\") " pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.704416 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qcsbq" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.719674 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8rxz\" (UniqueName: \"kubernetes.io/projected/fe62c67c-9803-4d61-a882-9406db825f8a-kube-api-access-v8rxz\") pod \"machine-config-server-f6l2s\" (UID: \"fe62c67c-9803-4d61-a882-9406db825f8a\") " pod="openshift-machine-config-operator/machine-config-server-f6l2s" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.726803 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.738117 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t26f4"] Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.749467 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdflk\" (UniqueName: \"kubernetes.io/projected/4892ebfa-87d9-4960-9c75-59dbdacfc690-kube-api-access-tdflk\") pod \"ingress-canary-ffhmh\" (UID: \"4892ebfa-87d9-4960-9c75-59dbdacfc690\") " pod="openshift-ingress-canary/ingress-canary-ffhmh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.776761 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qzxpg" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.785342 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:34 crc kubenswrapper[4851]: E1001 12:55:34.785726 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:35.285697166 +0000 UTC m=+143.630814652 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.791001 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2kfkj"] Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.797036 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-72dqh" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.823745 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vxwl6"] Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.874762 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd"] Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.875356 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fclxw" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.875899 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6ccbw" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.887986 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:34 crc kubenswrapper[4851]: E1001 12:55:34.888557 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:35.388544631 +0000 UTC m=+143.733662117 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.904829 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-qtlzx" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.908646 4851 patch_prober.go:28] interesting pod/router-default-5444994796-qtlzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:55:34 crc kubenswrapper[4851]: [-]has-synced failed: reason withheld Oct 01 12:55:34 crc kubenswrapper[4851]: [+]process-running ok Oct 01 12:55:34 crc kubenswrapper[4851]: healthz check failed Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.908701 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtlzx" podUID="24e14587-7a28-41eb-9184-668739c10654" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.908821 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7wzb9"] Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.921473 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-28f4n" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.948548 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-b76sz" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.981774 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-f6l2s" Oct 01 12:55:34 crc kubenswrapper[4851]: I1001 12:55:34.989641 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:34 crc kubenswrapper[4851]: E1001 12:55:34.990097 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:35.490082979 +0000 UTC m=+143.835200465 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.033715 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ffhmh" Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.091811 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:35 crc kubenswrapper[4851]: E1001 12:55:35.092173 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:35.592140632 +0000 UTC m=+143.937258118 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.149833 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mffkh"] Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.193903 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:35 crc kubenswrapper[4851]: E1001 12:55:35.194292 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:35.694267627 +0000 UTC m=+144.039385113 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.224815 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7hnt8"] Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.237117 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-fv9xq"] Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.240251 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jqn7q"] Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.254040 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-qtlzx" podStartSLOduration=121.254021405 podStartE2EDuration="2m1.254021405s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:35.25275947 +0000 UTC m=+143.597876956" watchObservedRunningTime="2025-10-01 12:55:35.254021405 +0000 UTC m=+143.599138891" Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.296549 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:35 crc kubenswrapper[4851]: E1001 12:55:35.297151 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:35.797140593 +0000 UTC m=+144.142258069 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.299455 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-68zzg" event={"ID":"0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697","Type":"ContainerStarted","Data":"a4144df0fe2deb9ceb481741206d418aa3b940b2264defaca1b39dddce8c5735"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.299477 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-68zzg" event={"ID":"0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697","Type":"ContainerStarted","Data":"eb758fb8876d5bc408684a4229bfe807576b9c831ff39423ca0115deb7efbea2"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.302094 4851 generic.go:334] "Generic (PLEG): container finished" podID="327992e6-37f8-487b-915b-b633070aece3" containerID="4650d69bb77387eec4f9a4a4f455216a879ecb493a65ceae038d303de63f8665" exitCode=0 Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.302598 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" event={"ID":"327992e6-37f8-487b-915b-b633070aece3","Type":"ContainerDied","Data":"4650d69bb77387eec4f9a4a4f455216a879ecb493a65ceae038d303de63f8665"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.308240 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mffkh" event={"ID":"87722bcd-9eeb-4b6b-ba53-b6283053a06f","Type":"ContainerStarted","Data":"68b4c46493ee9b7ff603c0cc5ca69f5d7936ecd1a27e61c50d0990e4d0c7f5c8"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.315451 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xjvqh" event={"ID":"b5b5efe6-729a-431f-b8cd-67562ec18593","Type":"ContainerStarted","Data":"4f661b194509e41fcefde57718e0f5f850d35928f579e9235d1e3621cab8ae53"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.315484 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xjvqh" event={"ID":"b5b5efe6-729a-431f-b8cd-67562ec18593","Type":"ContainerStarted","Data":"ea1ed7ee94da6c163f181cab7768b49a7f8eb8726aaeada4beba7cc0563adeae"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.320494 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bhp8c" event={"ID":"c7ff570b-9b52-4d2f-b030-40eca1804794","Type":"ContainerStarted","Data":"86b1b8f1f4f4dee4e3db12b308c35e2c5241ef2e62c93a61060961d9684c684e"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.320754 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bhp8c" event={"ID":"c7ff570b-9b52-4d2f-b030-40eca1804794","Type":"ContainerStarted","Data":"b04785f063db4d371bad97c4608f5d77c5448f892d141a0115860e97db44cc8b"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.329279 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b7tcr" event={"ID":"68776259-af0d-442c-b3e5-5e242a7b8a9f","Type":"ContainerStarted","Data":"f26f859be7edc336b631842eb83ffb7c92cdfbf53d7060f1049e8b02996e8b09"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.348958 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t26f4" event={"ID":"00931d70-10d2-42f3-83ff-6a25098df202","Type":"ContainerStarted","Data":"64bd925d443affd8af28001701ac959e0efcc6675a3b85f350980d60b8276248"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.351450 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" event={"ID":"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b","Type":"ContainerStarted","Data":"c50c62c2c54ce680d974806d475bcfca2774da14bc2449cc93f21e51cede59c2"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.366183 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2kfkj" event={"ID":"3d8acc9e-4d2b-4118-9e90-d95ce539747a","Type":"ContainerStarted","Data":"60dc3332087f9afe7d43b591c803f1d4ae462b0e30a388cc7f26a8ead03057f6"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.381991 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w" event={"ID":"23bc52e7-0e50-454f-87ee-b82b608ee34a","Type":"ContainerStarted","Data":"385bb429789121b154f05012ccdccfb383bb2c47dcac308f555f2dad6f87c844"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.382024 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w" event={"ID":"23bc52e7-0e50-454f-87ee-b82b608ee34a","Type":"ContainerStarted","Data":"436692da6b98da799ec9e5b90a291518093ccf3763472e78b40ed653d2fbee80"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.384605 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8dv98" event={"ID":"8002697d-726f-49e7-830c-c64083912932","Type":"ContainerStarted","Data":"68036fcc9f90e00d01214894d62f049cd5e035237830f1abf7c37dddea78b997"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.385193 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn" event={"ID":"f3d024a0-38bb-4a46-8b02-3f380b51d144","Type":"ContainerStarted","Data":"f37b5311c375c63d0a60528fd333e78cdd8eca12bf3a52bdce803c250e6e9005"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.387808 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c9zt4" event={"ID":"5648b4bb-f782-4788-b87a-b484fbd1f375","Type":"ContainerStarted","Data":"76ef32bb5f331630384c408c61143cd8f73f54c3131cbca828e79a0c489a0407"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.391329 4851 generic.go:334] "Generic (PLEG): container finished" podID="c557af36-f061-43db-9a5b-e88e8df92493" containerID="12f868656d5544fb953ae57676b4ff520f948efea62a181a5e45fd59e43bf0dc" exitCode=0 Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.391406 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" event={"ID":"c557af36-f061-43db-9a5b-e88e8df92493","Type":"ContainerDied","Data":"12f868656d5544fb953ae57676b4ff520f948efea62a181a5e45fd59e43bf0dc"} Oct 01 12:55:35 crc 
kubenswrapper[4851]: I1001 12:55:35.393797 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-vxwl6" event={"ID":"c0266e2f-59eb-4610-beef-77397d4c297f","Type":"ContainerStarted","Data":"49b9b1e7fc460fd2ef255b4ae911ce0ca084f24cfdaebd6040bb1047cc170a09"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.397372 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:35 crc kubenswrapper[4851]: E1001 12:55:35.398553 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:35.898536208 +0000 UTC m=+144.243653694 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.405268 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6p6zn" event={"ID":"9318518d-b104-4804-86ed-0b53f1c92c44","Type":"ContainerStarted","Data":"485cb968567c8d28217d26a6550943655a374969f31ce81078bfaaf0ce5df6e6"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.405304 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6p6zn" event={"ID":"9318518d-b104-4804-86ed-0b53f1c92c44","Type":"ContainerStarted","Data":"4bcb3c3c72082c9ef9a130a1930abc9828ed9e6731fa5ead2a2badf01acf3925"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.407512 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-mvb7b" event={"ID":"d07660b5-fafb-43c6-89a0-eb5a359eb314","Type":"ContainerStarted","Data":"17f9845479ed39145b723eea6b329558f2d9eb0e4f0bbb46efaca1077b3e9678"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.409601 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gjvsn"] Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.411852 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2qlz9" event={"ID":"2e0769b5-2ba5-4464-b580-ec55796e5ea1","Type":"ContainerStarted","Data":"fc060fc8743ea11419b0e064174f9eda321d9ff491671f4714b5c0c68cc3367c"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.411883 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2qlz9" event={"ID":"2e0769b5-2ba5-4464-b580-ec55796e5ea1","Type":"ContainerStarted","Data":"f617c1d14446ed0b2788fc72b352d1d0bd4c6d633c6b8f32c6a6ad8b5fc927eb"} Oct 01 12:55:35 crc 
kubenswrapper[4851]: I1001 12:55:35.413444 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" event={"ID":"f6538b69-30fc-4cc0-80da-62537b61f41f","Type":"ContainerStarted","Data":"31fcf8212060db1c25000d2426518f27760b791f3b2b30ab82bb76ec52d29229"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.414133 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.414767 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7wzb9" event={"ID":"1eb7909a-cc95-480b-bd8d-e1f6092f3baa","Type":"ContainerStarted","Data":"7e63d2d75beed4e25ccc28b269b4c6797a6df3aae2cc0468e5f7f4fafed34dda"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.415362 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lmpr6" event={"ID":"f541a9af-1ddc-4f57-b6e9-2b1c5baa9953","Type":"ContainerStarted","Data":"f1c1c513a74857e68b5274ad89b031fdf3dfb575d07be3f1d4e84e5c44758efa"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.416257 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" event={"ID":"86146cb0-0a54-4309-8345-565e2da39442","Type":"ContainerStarted","Data":"82afd1baebccdbd41b926a7827b4079a155356586b9326a290c6cb6542e1507b"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.416803 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.417419 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct" event={"ID":"e967d0be-0502-4857-9be0-1a6560362c34","Type":"ContainerStarted","Data":"4d79c3f2b428e5d98c2d7f994b61b4a69caa5e21a7b97583749d127a027e481b"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.418392 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nk8rb" event={"ID":"aef587e8-7f95-42cd-9a4f-10b1024abef0","Type":"ContainerStarted","Data":"161fc0934eec1ca9d971b26b957fc812641a33aa04092fd50613a73500f5336b"} Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.426738 4851 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-6z8fq container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.426742 4851 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-k7rvb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.426781 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" podUID="86146cb0-0a54-4309-8345-565e2da39442" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Oct 01 12:55:35 crc 
kubenswrapper[4851]: I1001 12:55:35.426802 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" podUID="f6538b69-30fc-4cc0-80da-62537b61f41f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.447532 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-x5dht"] Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.461127 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tbg25"] Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.488802 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qcsbq"] Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.499379 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:35 crc kubenswrapper[4851]: E1001 12:55:35.501913 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:36.001896818 +0000 UTC m=+144.347014304 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:35 crc kubenswrapper[4851]: W1001 12:55:35.524177 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode723ab1e_2b65_4236_91f4_1dbae3e4acf7.slice/crio-540075bda7660bb664eea8c4e9cfa15ece5148c66d5d40e83749872232037d54 WatchSource:0}: Error finding container 540075bda7660bb664eea8c4e9cfa15ece5148c66d5d40e83749872232037d54: Status 404 returned error can't find the container with id 540075bda7660bb664eea8c4e9cfa15ece5148c66d5d40e83749872232037d54 Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.527438 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-72dqh"] Oct 01 12:55:35 crc kubenswrapper[4851]: W1001 12:55:35.543957 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8e7ea87_4895_4f48_b774_e337103d3951.slice/crio-8cc2bf60bfed75b7eb8fa9ef3a428e9a235808aab744737fe2ef638f5ce66911 WatchSource:0}: Error finding container 8cc2bf60bfed75b7eb8fa9ef3a428e9a235808aab744737fe2ef638f5ce66911: Status 404 returned error can't find the container with id 8cc2bf60bfed75b7eb8fa9ef3a428e9a235808aab744737fe2ef638f5ce66911 Oct 01 12:55:35 crc kubenswrapper[4851]: W1001 12:55:35.547077 4851 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod928cc097_331d_49d7_8a9d_957e2032f941.slice/crio-b54424b2441b36a71c242a068fed549a9b01e6d9f25aff13f3bb2f5313ff7985 WatchSource:0}: Error finding container b54424b2441b36a71c242a068fed549a9b01e6d9f25aff13f3bb2f5313ff7985: Status 404 returned error can't find the container with id b54424b2441b36a71c242a068fed549a9b01e6d9f25aff13f3bb2f5313ff7985 Oct 01 12:55:35 crc kubenswrapper[4851]: W1001 12:55:35.572828 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8de9ba8e_0c67_49d3_becd_c845dbba7335.slice/crio-29b8d276c64468ab8d3b17b5a87619ca01a3f6f0dd53dfd8d44b0c571e93dd87 WatchSource:0}: Error finding container 29b8d276c64468ab8d3b17b5a87619ca01a3f6f0dd53dfd8d44b0c571e93dd87: Status 404 returned error can't find the container with id 29b8d276c64468ab8d3b17b5a87619ca01a3f6f0dd53dfd8d44b0c571e93dd87 Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.599966 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:35 crc kubenswrapper[4851]: E1001 12:55:35.600267 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:36.100215335 +0000 UTC m=+144.445332821 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.601142 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:35 crc kubenswrapper[4851]: E1001 12:55:35.609481 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:36.109467706 +0000 UTC m=+144.454585192 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.701749 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:35 crc kubenswrapper[4851]: E1001 12:55:35.701969 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:36.201943388 +0000 UTC m=+144.547060884 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.702203 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:35 crc kubenswrapper[4851]: E1001 12:55:35.702602 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:36.202587826 +0000 UTC m=+144.547705312 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.714636 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qzxpg"] Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.720012 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-28f4n"] Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.722535 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dq7rl"] Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.808687 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:35 crc kubenswrapper[4851]: E1001 12:55:35.809029 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:36.309001523 +0000 UTC m=+144.654119009 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.810311 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:35 crc kubenswrapper[4851]: E1001 12:55:35.811821 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:36.311810022 +0000 UTC m=+144.656927508 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.813584 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-b76sz"] Oct 01 12:55:35 crc kubenswrapper[4851]: W1001 12:55:35.855402 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5b1d979_1dee_401d_89e9_92314369ca7d.slice/crio-cc17eec21edbe191c1e7643682e31a9a4c60d58d1c6bcebb2a93940842be53d7 WatchSource:0}: Error finding container cc17eec21edbe191c1e7643682e31a9a4c60d58d1c6bcebb2a93940842be53d7: Status 404 returned error can't find the container with id cc17eec21edbe191c1e7643682e31a9a4c60d58d1c6bcebb2a93940842be53d7 Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.909858 4851 patch_prober.go:28] interesting pod/router-default-5444994796-qtlzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:55:35 crc kubenswrapper[4851]: [-]has-synced failed: reason withheld Oct 01 12:55:35 crc kubenswrapper[4851]: [+]process-running ok Oct 01 12:55:35 crc kubenswrapper[4851]: healthz check failed Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.909914 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtlzx" podUID="24e14587-7a28-41eb-9184-668739c10654" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.920289 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:35 crc kubenswrapper[4851]: E1001 12:55:35.920421 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:36.42040423 +0000 UTC m=+144.765521716 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.926042 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:35 crc kubenswrapper[4851]: E1001 12:55:35.926391 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:36.426375379 +0000 UTC m=+144.771492865 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.930957 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fclxw"] Oct 01 12:55:35 crc kubenswrapper[4851]: I1001 12:55:35.961282 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6ccbw"] Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.027710 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:36 crc kubenswrapper[4851]: E1001 12:55:36.028024 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:36.528000529 +0000 UTC m=+144.873118015 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.028714 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:36 crc kubenswrapper[4851]: E1001 12:55:36.029182 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:36.529165552 +0000 UTC m=+144.874283038 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.041849 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ffhmh"] Oct 01 12:55:36 crc kubenswrapper[4851]: W1001 12:55:36.044530 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d1f1318_7c6a_42ea_a987_c419603e81bf.slice/crio-a4d8fefe10e33089b6a03d0edf3c77e67d60a1bb4fd3b1e1f5a2c81abe5c1b54 WatchSource:0}: Error finding container a4d8fefe10e33089b6a03d0edf3c77e67d60a1bb4fd3b1e1f5a2c81abe5c1b54: Status 404 returned error can't find the container with id a4d8fefe10e33089b6a03d0edf3c77e67d60a1bb4fd3b1e1f5a2c81abe5c1b54 Oct 01 12:55:36 crc kubenswrapper[4851]: W1001 12:55:36.071933 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4892ebfa_87d9_4960_9c75_59dbdacfc690.slice/crio-015e98c8df0c059db9b592c981d8f50f70d1b98cd241ae353d7e4d87a14bc39b WatchSource:0}: Error finding container 015e98c8df0c059db9b592c981d8f50f70d1b98cd241ae353d7e4d87a14bc39b: Status 404 returned error can't find the container with id 015e98c8df0c059db9b592c981d8f50f70d1b98cd241ae353d7e4d87a14bc39b Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.129514 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:36 crc kubenswrapper[4851]: E1001 12:55:36.129850 4851 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:36.629831476 +0000 UTC m=+144.974948962 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.137142 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-68zzg" podStartSLOduration=122.137126232 podStartE2EDuration="2m2.137126232s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:36.134598831 +0000 UTC m=+144.479716327" watchObservedRunningTime="2025-10-01 12:55:36.137126232 +0000 UTC m=+144.482243718" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.231456 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:36 crc kubenswrapper[4851]: E1001 12:55:36.231803 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:36.731792386 +0000 UTC m=+145.076909872 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.313594 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2qlz9" podStartSLOduration=123.313574667 podStartE2EDuration="2m3.313574667s" podCreationTimestamp="2025-10-01 12:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:36.279299618 +0000 UTC m=+144.624417094" watchObservedRunningTime="2025-10-01 12:55:36.313574667 +0000 UTC m=+144.658692153" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.335389 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:36 crc kubenswrapper[4851]: E1001 12:55:36.335701 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:36.835687661 +0000 UTC m=+145.180805137 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.444396 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:36 crc kubenswrapper[4851]: E1001 12:55:36.444712 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:36.944698361 +0000 UTC m=+145.289815847 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.503929 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-72dqh" event={"ID":"8de9ba8e-0c67-49d3-becd-c845dbba7335","Type":"ContainerStarted","Data":"b78a5ea4ca0a87786e117ba96a19b68a0f1b530bfa92072e9bb0fbcb270f4e8a"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.504273 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-72dqh" event={"ID":"8de9ba8e-0c67-49d3-becd-c845dbba7335","Type":"ContainerStarted","Data":"29b8d276c64468ab8d3b17b5a87619ca01a3f6f0dd53dfd8d44b0c571e93dd87"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.507342 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dq7rl" event={"ID":"a93ae38a-3f2c-42bf-9da9-d30f439a371f","Type":"ContainerStarted","Data":"c3a801673477389f47c97b5fc9bda8647b1a35426b2eb2df65dcf68b92acf29f"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.507384 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dq7rl" event={"ID":"a93ae38a-3f2c-42bf-9da9-d30f439a371f","Type":"ContainerStarted","Data":"0cd44da43267cf4ca6ac0bc9f028b673b943b85edfeabf16e613afd850c864f0"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.509073 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dq7rl" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.511276 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qcsbq" event={"ID":"928cc097-331d-49d7-8a9d-957e2032f941","Type":"ContainerStarted","Data":"1e1ac978c6ffbc29261c2eadf852b31f1c86e77a6214c605820d93ccab023034"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.511300 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qcsbq" event={"ID":"928cc097-331d-49d7-8a9d-957e2032f941","Type":"ContainerStarted","Data":"b54424b2441b36a71c242a068fed549a9b01e6d9f25aff13f3bb2f5313ff7985"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.512288 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-qcsbq" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.515061 4851 patch_prober.go:28] interesting pod/downloads-7954f5f757-qcsbq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.515096 4851 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dq7rl container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Oct 01 12:55:36 crc 
kubenswrapper[4851]: I1001 12:55:36.515099 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qcsbq" podUID="928cc097-331d-49d7-8a9d-957e2032f941" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.515138 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dq7rl" podUID="a93ae38a-3f2c-42bf-9da9-d30f439a371f" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.517724 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8dv98" event={"ID":"8002697d-726f-49e7-830c-c64083912932","Type":"ContainerStarted","Data":"923f614fbe207990beb854e5ee0918dca5e085ca5f9e0cb0b5b85ea29bf1e6c8"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.537179 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mffkh" event={"ID":"87722bcd-9eeb-4b6b-ba53-b6283053a06f","Type":"ContainerStarted","Data":"a68d3ac0673d4342871ec5d16a22c44298f91215cfdea9869aab8d5d1810f5b3"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.538409 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" podStartSLOduration=123.538396677 podStartE2EDuration="2m3.538396677s" podCreationTimestamp="2025-10-01 12:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:36.536823453 +0000 UTC m=+144.881940939" watchObservedRunningTime="2025-10-01 12:55:36.538396677 +0000 UTC m=+144.883514163" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.544999 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:36 crc kubenswrapper[4851]: E1001 12:55:36.546319 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:37.046302581 +0000 UTC m=+145.391420127 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.548715 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ffhmh" event={"ID":"4892ebfa-87d9-4960-9c75-59dbdacfc690","Type":"ContainerStarted","Data":"015e98c8df0c059db9b592c981d8f50f70d1b98cd241ae353d7e4d87a14bc39b"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.549859 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-b76sz" event={"ID":"3ed2545b-85c0-4141-819a-18c221911af9","Type":"ContainerStarted","Data":"507c0d507c7ce10c46de064f63164471fe53fb098c241bbd7ff03ee8de99cf25"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.554445 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fv9xq" event={"ID":"91e48130-a450-4f7e-aa07-bfcaa13aae66","Type":"ContainerStarted","Data":"954ef3613cb8f767ba762ad02536c505d446cc4d4a3fcc32f086b4e2080e07e3"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.554469 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fv9xq" event={"ID":"91e48130-a450-4f7e-aa07-bfcaa13aae66","Type":"ContainerStarted","Data":"7e555e47e270da6d1652f2060377eb1404f6f0d3fb99a781ae15c2d14327622e"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.566444 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct" event={"ID":"e967d0be-0502-4857-9be0-1a6560362c34","Type":"ContainerStarted","Data":"c0f270cc1df6554341b8170a62beef40eaee6c78d0d3064a284e0ce83aa469aa"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.568146 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.575865 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x5dht" event={"ID":"d8e7ea87-4895-4f48-b774-e337103d3951","Type":"ContainerStarted","Data":"68ba37ffe3a9c3de489c1a9e976e1b5e9ee26485d6ed314376163ca3a351f48e"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.575903 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x5dht" event={"ID":"d8e7ea87-4895-4f48-b774-e337103d3951","Type":"ContainerStarted","Data":"8cc2bf60bfed75b7eb8fa9ef3a428e9a235808aab744737fe2ef638f5ce66911"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.592697 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn" event={"ID":"f3d024a0-38bb-4a46-8b02-3f380b51d144","Type":"ContainerStarted","Data":"4867ff866d62df55f384f4fa5075955ae807192b087575c95ea276562578f52f"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.594346 4851 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-cqwct container/packageserver 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:5443/healthz\": dial tcp 10.217.0.24:5443: connect: connection refused" start-of-body= Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.594379 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct" podUID="e967d0be-0502-4857-9be0-1a6560362c34" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.24:5443/healthz\": dial tcp 10.217.0.24:5443: connect: connection refused" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.601457 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bhp8c" podStartSLOduration=122.601439418 podStartE2EDuration="2m2.601439418s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:36.599067921 +0000 UTC m=+144.944185407" watchObservedRunningTime="2025-10-01 12:55:36.601439418 +0000 UTC m=+144.946556904" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.601760 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6ccbw" event={"ID":"68c1357c-a93f-4ecc-9743-2bdf982607c2","Type":"ContainerStarted","Data":"8072e48f5a32977b95c8756ac2e2ad2381f1c1cfd6b72a70c7e85d283eab23f0"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.644827 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nk8rb" event={"ID":"aef587e8-7f95-42cd-9a4f-10b1024abef0","Type":"ContainerStarted","Data":"fe134c4428917f58ed414a73a9f805b741ca0cf52f02adc496a529c6a1372c04"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.646575 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:36 crc kubenswrapper[4851]: E1001 12:55:36.650702 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:37.150654239 +0000 UTC m=+145.495771925 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.677634 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c9zt4" event={"ID":"5648b4bb-f782-4788-b87a-b484fbd1f375","Type":"ContainerStarted","Data":"a0300599ad68448142a643ed716b8f5ed786ddfc59a6c9db6cf9eea59b5a885f"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.684262 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" podStartSLOduration=122.684241337 podStartE2EDuration="2m2.684241337s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:36.655874246 +0000 UTC m=+145.000991732" watchObservedRunningTime="2025-10-01 12:55:36.684241337 +0000 UTC m=+145.029358823" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.684545 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-f6l2s" event={"ID":"fe62c67c-9803-4d61-a882-9406db825f8a","Type":"ContainerStarted","Data":"346b6738a2c676677e9a94f522848ed14ff36f4c71406b30c2dbbb3bbfdf3b3d"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.684590 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-f6l2s" event={"ID":"fe62c67c-9803-4d61-a882-9406db825f8a","Type":"ContainerStarted","Data":"c4b835a569eb63657c547a3133dbee8938210e09520f30c68e91ae8368b0f57e"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.708232 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fclxw" event={"ID":"9d1f1318-7c6a-42ea-a987-c419603e81bf","Type":"ContainerStarted","Data":"a4d8fefe10e33089b6a03d0edf3c77e67d60a1bb4fd3b1e1f5a2c81abe5c1b54"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.733662 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-28f4n" event={"ID":"a5b1d979-1dee-401d-89e9-92314369ca7d","Type":"ContainerStarted","Data":"53d96d0f1076201be3cb78f79d6daf4a7ca28da8ab421d4e5e795e6485f53c90"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.734020 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-28f4n" event={"ID":"a5b1d979-1dee-401d-89e9-92314369ca7d","Type":"ContainerStarted","Data":"cc17eec21edbe191c1e7643682e31a9a4c60d58d1c6bcebb2a93940842be53d7"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.735868 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-28f4n" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.737143 4851 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-28f4n container/catalog-operator 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.737191 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-28f4n" podUID="a5b1d979-1dee-401d-89e9-92314369ca7d" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.737899 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7wzb9" event={"ID":"1eb7909a-cc95-480b-bd8d-e1f6092f3baa","Type":"ContainerStarted","Data":"44cb2423dba55e0fe8b639386294df5d9520ade63ad92887e1049829bc4a1205"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.749149 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:36 crc kubenswrapper[4851]: E1001 12:55:36.749264 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:37.249249414 +0000 UTC m=+145.594366890 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.749544 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:36 crc kubenswrapper[4851]: E1001 12:55:36.750819 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:37.250811688 +0000 UTC m=+145.595929164 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.763793 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t26f4" event={"ID":"00931d70-10d2-42f3-83ff-6a25098df202","Type":"ContainerStarted","Data":"4d6ce0075888f1cc962f2249e36ff895b60ad0f988c3fb7941e47f3187251bbc"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.774170 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-mvb7b" event={"ID":"d07660b5-fafb-43c6-89a0-eb5a359eb314","Type":"ContainerStarted","Data":"fe5d4b7dcc174d80eb935202f23fa774a74e9b5930af3dd32bd6ca8c16eb34f4"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.774556 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-mvb7b" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.775466 4851 patch_prober.go:28] interesting pod/console-operator-58897d9998-mvb7b container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.775492 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-mvb7b" podUID="d07660b5-fafb-43c6-89a0-eb5a359eb314" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.792533 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b7tcr" event={"ID":"68776259-af0d-442c-b3e5-5e242a7b8a9f","Type":"ContainerStarted","Data":"6dad1de34e275f8c6c2e8b8d99662c4dc4a3e47d17f62be0140656ae01b44d15"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.812348 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jqn7q" event={"ID":"e00479b8-ba27-494b-907d-a4a201ae4be1","Type":"ContainerStarted","Data":"1b7c22757fe592fc3c125494dd2c6489bff250a0b78ff19e41661b4f007ff203"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.812390 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jqn7q" event={"ID":"e00479b8-ba27-494b-907d-a4a201ae4be1","Type":"ContainerStarted","Data":"517a7bff11f64c0236a2fa04cf042e3425d8bc77f510694c279bc769124bd6eb"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.819866 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" event={"ID":"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b","Type":"ContainerStarted","Data":"02ea592b1b4b4189c8ed849e2d29c1a8f0e0d9c2341a6bcc56e43c45e0937d41"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.820322 4851 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.833453 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" event={"ID":"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7","Type":"ContainerStarted","Data":"862ff629abfd60968c18d9376a9a000270fb889eafae7482d6b952ff97a1e5f9"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.833780 4851 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-rgkzd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.833814 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" podUID="c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.835313 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" event={"ID":"e723ab1e-2b65-4236-91f4-1dbae3e4acf7","Type":"ContainerStarted","Data":"e207030b670b6f4efff60bd08a6a9a867c5484b60b7df19f3e2fcb4b31df99ec"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.835335 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" event={"ID":"e723ab1e-2b65-4236-91f4-1dbae3e4acf7","Type":"ContainerStarted","Data":"540075bda7660bb664eea8c4e9cfa15ece5148c66d5d40e83749872232037d54"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.835740 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.836672 4851 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7hnt8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.836714 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" podUID="e723ab1e-2b65-4236-91f4-1dbae3e4acf7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.838164 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2kfkj" event={"ID":"3d8acc9e-4d2b-4118-9e90-d95ce539747a","Type":"ContainerStarted","Data":"114ccccb6c28a2777d5798273eab754a14fa526795e269d85be49317f7001dfb"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.838454 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2kfkj" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.859405 4851 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:36 crc kubenswrapper[4851]: E1001 12:55:36.862392 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:37.362373159 +0000 UTC m=+145.707490645 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.864748 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qzxpg" event={"ID":"45efbf96-913d-4b37-87dc-0ff84f96257a","Type":"ContainerStarted","Data":"b4f7736cf3207846ba3ad8d159cec4f5f0e80aaded14c8580d01496e36a6d078"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.872364 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lmpr6" event={"ID":"f541a9af-1ddc-4f57-b6e9-2b1c5baa9953","Type":"ContainerStarted","Data":"3a1dff0bfa9785574ccbd4ef007f0130e091d1e8ceb34e1757711d556dc835a8"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.888216 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tbg25" event={"ID":"dbd5b9f8-88ba-4b49-86b0-ce9de90aa30d","Type":"ContainerStarted","Data":"f93b13709360e243463b20405cc5dbf0be5ce959c0ecdabae14daddbdbb25e1d"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.891838 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-vxwl6" event={"ID":"c0266e2f-59eb-4610-beef-77397d4c297f","Type":"ContainerStarted","Data":"71f6567062ef7718797d628a9793b3eea1f094d4b37cdb05d7f5cef5a84aacee"} Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.896876 4851 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-6z8fq container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.896913 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" podUID="86146cb0-0a54-4309-8345-565e2da39442" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.907059 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.910625 4851 
patch_prober.go:28] interesting pod/router-default-5444994796-qtlzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:55:36 crc kubenswrapper[4851]: [-]has-synced failed: reason withheld Oct 01 12:55:36 crc kubenswrapper[4851]: [+]process-running ok Oct 01 12:55:36 crc kubenswrapper[4851]: healthz check failed Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.910658 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtlzx" podUID="24e14587-7a28-41eb-9184-668739c10654" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.938623 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-xjvqh" podStartSLOduration=123.938606233 podStartE2EDuration="2m3.938606233s" podCreationTimestamp="2025-10-01 12:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:36.910919351 +0000 UTC m=+145.256036837" watchObservedRunningTime="2025-10-01 12:55:36.938606233 +0000 UTC m=+145.283723719" Oct 01 12:55:36 crc kubenswrapper[4851]: I1001 12:55:36.962816 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:36 crc kubenswrapper[4851]: E1001 12:55:36.973510 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:37.473481208 +0000 UTC m=+145.818598694 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.064603 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:37 crc kubenswrapper[4851]: E1001 12:55:37.065392 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:37.565366414 +0000 UTC m=+145.910483900 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.111387 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w" podStartSLOduration=124.111372803 podStartE2EDuration="2m4.111372803s" podCreationTimestamp="2025-10-01 12:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:36.971317077 +0000 UTC m=+145.316434563" watchObservedRunningTime="2025-10-01 12:55:37.111372803 +0000 UTC m=+145.456490289" Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.169084 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:37 crc kubenswrapper[4851]: E1001 12:55:37.169391 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:37.669376092 +0000 UTC m=+146.014493578 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.228164 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6p6zn" podStartSLOduration=124.228150062 podStartE2EDuration="2m4.228150062s" podCreationTimestamp="2025-10-01 12:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:37.226812724 +0000 UTC m=+145.571930230" watchObservedRunningTime="2025-10-01 12:55:37.228150062 +0000 UTC m=+145.573267548" Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.269981 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:37 crc kubenswrapper[4851]: E1001 12:55:37.270759 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:37.770727915 +0000 UTC m=+146.115845401 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.302966 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-28f4n" podStartSLOduration=123.302951685 podStartE2EDuration="2m3.302951685s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:37.301786082 +0000 UTC m=+145.646903568" watchObservedRunningTime="2025-10-01 12:55:37.302951685 +0000 UTC m=+145.648069171"
Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.371421 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc"
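The repeating MountDevice and TearDown failures above share a single cause: kubelet cannot find kubevirt.io.hostpath-provisioner among its registered CSI drivers yet, because the csi-hostpathplugin-gjvsn pod that registers it is itself still starting in this same window (its ContainerStarted events appear nearby). A minimal sketch of the lookup-then-retry pattern, assuming hypothetical names (driverRegistry, csiOperation) rather than kubelet's real types:

```go
// Minimal sketch, not kubelet's actual source: how a CSI volume operation
// fails with "driver name ... not found in the list of registered CSI
// drivers" while the plugin has not yet registered, and how a retry is
// scheduled. driverRegistry and csiOperation are hypothetical names.
package main

import (
	"fmt"
	"sync"
	"time"
)

// driverRegistry stands in for kubelet's list of CSI drivers, which node
// plugins join by registering a unix socket through the plugin watcher.
type driverRegistry struct {
	mu      sync.RWMutex
	drivers map[string]string // driver name -> endpoint
}

func (r *driverRegistry) lookup(name string) (string, error) {
	r.mu.RLock()
	defer r.mu.RUnlock()
	ep, ok := r.drivers[name]
	if !ok {
		return "", fmt.Errorf("driver name %s not found in the list of registered CSI drivers", name)
	}
	return ep, nil
}

// csiOperation attempts a mount-style operation; on failure it reports when
// the next attempt may run, echoing the "durationBeforeRetry 500ms" lines.
func csiOperation(reg *driverRegistry, driver string) {
	const durationBeforeRetry = 500 * time.Millisecond
	if _, err := reg.lookup(driver); err != nil {
		fmt.Printf("failed. No retries permitted until %s: %v\n",
			time.Now().Add(durationBeforeRetry).Format(time.RFC3339Nano), err)
		return
	}
	fmt.Println("driver registered; would issue NodeStageVolume over gRPC here")
}

func main() {
	reg := &driverRegistry{drivers: map[string]string{}}
	csiOperation(reg, "kubevirt.io.hostpath-provisioner") // not registered yet: fails

	// Once the plugin pod comes up and registers its socket, the very same
	// operation succeeds, which is why these errors stop on their own.
	reg.mu.Lock()
	reg.drivers["kubevirt.io.hostpath-provisioner"] = "/var/lib/kubelet/plugins/csi-hostpath/csi.sock"
	reg.mu.Unlock()
	csiOperation(reg, "kubevirt.io.hostpath-provisioner")
}
```

Once the node plugin finishes starting and registers, the same Mount/Unmount operations go through; the flat 500ms durationBeforeRetry in these records is what keeps the reconciler re-attempting in the meantime.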
Oct 01 12:55:37 crc kubenswrapper[4851]: E1001 12:55:37.371770 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:37.871756908 +0000 UTC m=+146.216874394 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.395028 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c9zt4" podStartSLOduration=123.395014345 podStartE2EDuration="2m3.395014345s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:37.369224016 +0000 UTC m=+145.714341502" watchObservedRunningTime="2025-10-01 12:55:37.395014345 +0000 UTC m=+145.740131831"
Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.395632 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fclxw" podStartSLOduration=123.395625572 podStartE2EDuration="2m3.395625572s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:37.394243953 +0000 UTC m=+145.739361449" watchObservedRunningTime="2025-10-01 12:55:37.395625572 +0000 UTC m=+145.740743058"
Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.453999 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" podStartSLOduration=123.453982861 podStartE2EDuration="2m3.453982861s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:37.452278682 +0000 UTC m=+145.797396168" watchObservedRunningTime="2025-10-01 12:55:37.453982861 +0000 UTC m=+145.799100347"
Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.479091 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:55:37 crc kubenswrapper[4851]: E1001 12:55:37.479477 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:37.97946319 +0000 UTC m=+146.324580676 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.535955 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dq7rl" podStartSLOduration=123.535925025 podStartE2EDuration="2m3.535925025s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:37.527543409 +0000 UTC m=+145.872660895" watchObservedRunningTime="2025-10-01 12:55:37.535925025 +0000 UTC m=+145.881042511" Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.557881 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jqn7q" podStartSLOduration=123.557865795 podStartE2EDuration="2m3.557865795s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:37.556545398 +0000 UTC m=+145.901662894" watchObservedRunningTime="2025-10-01 12:55:37.557865795 +0000 UTC m=+145.902983281" Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.582696 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:37 crc kubenswrapper[4851]: E1001 12:55:37.583098 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:38.083083578 +0000 UTC m=+146.428201064 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.610609 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nk8rb" podStartSLOduration=123.610596415 podStartE2EDuration="2m3.610596415s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:37.609313479 +0000 UTC m=+145.954430965" watchObservedRunningTime="2025-10-01 12:55:37.610596415 +0000 UTC m=+145.955713901" Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.687071 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:37 crc kubenswrapper[4851]: E1001 12:55:37.687925 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:38.187900729 +0000 UTC m=+146.533018215 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.703969 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-mvb7b" podStartSLOduration=124.703949102 podStartE2EDuration="2m4.703949102s" podCreationTimestamp="2025-10-01 12:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:37.678701359 +0000 UTC m=+146.023818845" watchObservedRunningTime="2025-10-01 12:55:37.703949102 +0000 UTC m=+146.049066588" Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.735418 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-f6l2s" podStartSLOduration=6.7353981 podStartE2EDuration="6.7353981s" podCreationTimestamp="2025-10-01 12:55:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:37.702871211 +0000 UTC m=+146.047988697" watchObservedRunningTime="2025-10-01 12:55:37.7353981 +0000 UTC m=+146.080515586" Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.763421 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-qcsbq" podStartSLOduration=124.763403151 podStartE2EDuration="2m4.763403151s" podCreationTimestamp="2025-10-01 12:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:37.737398137 +0000 UTC m=+146.082515623" watchObservedRunningTime="2025-10-01 12:55:37.763403151 +0000 UTC m=+146.108520637" Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.765219 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lmpr6" podStartSLOduration=123.765212902 podStartE2EDuration="2m3.765212902s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:37.76407409 +0000 UTC m=+146.109191576" watchObservedRunningTime="2025-10-01 12:55:37.765212902 +0000 UTC m=+146.110330388" Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.792185 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:37 crc kubenswrapper[4851]: E1001 12:55:37.792480 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-01 12:55:38.292468553 +0000 UTC m=+146.637586039 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.797560 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7wzb9" podStartSLOduration=123.797547406 podStartE2EDuration="2m3.797547406s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:37.795354694 +0000 UTC m=+146.140472180" watchObservedRunningTime="2025-10-01 12:55:37.797547406 +0000 UTC m=+146.142664892" Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.892398 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-vxwl6" podStartSLOduration=124.892381875 podStartE2EDuration="2m4.892381875s" podCreationTimestamp="2025-10-01 12:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:37.891533071 +0000 UTC m=+146.236650557" watchObservedRunningTime="2025-10-01 12:55:37.892381875 +0000 UTC m=+146.237499361" Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.894066 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.894110 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2kfkj" podStartSLOduration=123.894102584 podStartE2EDuration="2m3.894102584s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:37.855622046 +0000 UTC m=+146.200739532" watchObservedRunningTime="2025-10-01 12:55:37.894102584 +0000 UTC m=+146.239220070" Oct 01 12:55:37 crc kubenswrapper[4851]: E1001 12:55:37.894449 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:38.394436573 +0000 UTC m=+146.739554059 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.911320 4851 patch_prober.go:28] interesting pod/router-default-5444994796-qtlzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:55:37 crc kubenswrapper[4851]: [-]has-synced failed: reason withheld Oct 01 12:55:37 crc kubenswrapper[4851]: [+]process-running ok Oct 01 12:55:37 crc kubenswrapper[4851]: healthz check failed Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.911373 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtlzx" podUID="24e14587-7a28-41eb-9184-668739c10654" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.918448 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fclxw" event={"ID":"9d1f1318-7c6a-42ea-a987-c419603e81bf","Type":"ContainerStarted","Data":"e537772e1154bee6b853e071365dca7bf15b1be670384145fb4f0e91ed8b4ba5"} Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.921036 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-8dv98" podStartSLOduration=123.921025054 podStartE2EDuration="2m3.921025054s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:37.920732816 +0000 UTC m=+146.265850302" watchObservedRunningTime="2025-10-01 12:55:37.921025054 +0000 UTC m=+146.266142540" Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.930484 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qzxpg" event={"ID":"45efbf96-913d-4b37-87dc-0ff84f96257a","Type":"ContainerStarted","Data":"781e21668f2615549308f7ba8ebad9dc40f4924942ab56dddf029db38aae1e30"} Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.930540 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qzxpg" event={"ID":"45efbf96-913d-4b37-87dc-0ff84f96257a","Type":"ContainerStarted","Data":"fba572927f9087db2c596ecf8490979db629fa2b2fb497b92adee0e7c10706a6"} Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.940958 4851 generic.go:334] "Generic (PLEG): container finished" podID="d8e7ea87-4895-4f48-b774-e337103d3951" containerID="68ba37ffe3a9c3de489c1a9e976e1b5e9ee26485d6ed314376163ca3a351f48e" exitCode=0 Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.941016 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x5dht" event={"ID":"d8e7ea87-4895-4f48-b774-e337103d3951","Type":"ContainerDied","Data":"68ba37ffe3a9c3de489c1a9e976e1b5e9ee26485d6ed314376163ca3a351f48e"} Oct 01 12:55:37 crc 
kubenswrapper[4851]: I1001 12:55:37.941039 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x5dht" event={"ID":"d8e7ea87-4895-4f48-b774-e337103d3951","Type":"ContainerStarted","Data":"70c89cbf2c5d0451842f6af5e42a541be97f886c8260e974c824e18879293335"} Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.941531 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x5dht" Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.979596 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-b76sz" event={"ID":"3ed2545b-85c0-4141-819a-18c221911af9","Type":"ContainerStarted","Data":"d404d0860fe72c01185e95b2f87f2e4543abb674ed7bf31e7a1a8c8ff6b078c3"} Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.985101 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-wmgkn" podStartSLOduration=123.985082154 podStartE2EDuration="2m3.985082154s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:37.969516134 +0000 UTC m=+146.314633620" watchObservedRunningTime="2025-10-01 12:55:37.985082154 +0000 UTC m=+146.330199640" Oct 01 12:55:37 crc kubenswrapper[4851]: I1001 12:55:37.995194 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:37 crc kubenswrapper[4851]: E1001 12:55:37.996976 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:38.496963919 +0000 UTC m=+146.842081405 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.014341 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" event={"ID":"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7","Type":"ContainerStarted","Data":"682594714bad90a9f4751a0426847a1e29a029699865fa81a99647a1d03fafe7"} Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.043781 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct" podStartSLOduration=124.043766401 podStartE2EDuration="2m4.043766401s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:38.04337743 +0000 UTC m=+146.388494916" watchObservedRunningTime="2025-10-01 12:55:38.043766401 +0000 UTC m=+146.388883887" Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.049791 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t26f4" event={"ID":"00931d70-10d2-42f3-83ff-6a25098df202","Type":"ContainerStarted","Data":"a1a8130a1f87929d43d02123e890a0362f9434076f60020651bdec7c509c0f23"} Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.050360 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-t26f4" Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.093323 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" event={"ID":"c557af36-f061-43db-9a5b-e88e8df92493","Type":"ContainerStarted","Data":"c9b78f6b6d671fec40018e08da5f8ba191fc2b51ad86c885b10b17908c10a745"} Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.093372 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" event={"ID":"c557af36-f061-43db-9a5b-e88e8df92493","Type":"ContainerStarted","Data":"b46b76d419e823375fb18663928eeac6e1a366a7c9747b8714474cf74b78f1ce"} Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.097796 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:38 crc kubenswrapper[4851]: E1001 12:55:38.098280 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:38.598262741 +0000 UTC m=+146.943380227 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.110035 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mffkh" podStartSLOduration=125.110020273 podStartE2EDuration="2m5.110020273s" podCreationTimestamp="2025-10-01 12:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:38.108518241 +0000 UTC m=+146.453635737" watchObservedRunningTime="2025-10-01 12:55:38.110020273 +0000 UTC m=+146.455137759"
Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.124012 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ffhmh" event={"ID":"4892ebfa-87d9-4960-9c75-59dbdacfc690","Type":"ContainerStarted","Data":"c23c1a6020224d4efc7f565184c344a69b701e702c95289ffd0c9a15748fde1b"}
Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.150863 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" podStartSLOduration=124.150846116 podStartE2EDuration="2m4.150846116s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:38.144703653 +0000 UTC m=+146.489821149" watchObservedRunningTime="2025-10-01 12:55:38.150846116 +0000 UTC m=+146.495963602"
Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.156628 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fv9xq" event={"ID":"91e48130-a450-4f7e-aa07-bfcaa13aae66","Type":"ContainerStarted","Data":"4b3db36636429ee484acf3c3db7a68c8ebbb94d9ac34ea26fa8bad7df1576537"}
Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.181655 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" event={"ID":"327992e6-37f8-487b-915b-b633070aece3","Type":"ContainerStarted","Data":"eb641c59a0f1e0e4b93c7ed36918c63a674887188a44e4fe776050b3d11b6cf7"}
Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.190114 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qzxpg" podStartSLOduration=124.190096985 podStartE2EDuration="2m4.190096985s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:38.185600168 +0000 UTC m=+146.530717644" watchObservedRunningTime="2025-10-01 12:55:38.190096985 +0000 UTC m=+146.535214471"
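The "SyncLoop (PLEG)" lines above and below are the Pod Lifecycle Event Generator surfacing container state changes it relisted from the container runtime: each event names the pod, an event type such as ContainerStarted or ContainerDied, and the container ID involved, and the sync loop then re-syncs that pod. A rough sketch of the event shape and dispatch, with hypothetical type names (the real ones live in kubelet's pleg package):

```go
// Minimal sketch with hypothetical types, not kubelet's pleg package: the
// shape of the "SyncLoop (PLEG): event for pod" records. The event
// generator relists container state from the runtime and emits one event
// per observed change; the sync loop logs and dispatches each one.
package main

import "fmt"

type podLifecycleEventType string

const (
	containerStarted podLifecycleEventType = "ContainerStarted"
	containerDied    podLifecycleEventType = "ContainerDied"
)

type podLifecycleEvent struct {
	ID   string                // pod UID
	Type podLifecycleEventType // what changed
	Data string                // container (or sandbox) ID
}

func syncLoopIteration(pod string, ev podLifecycleEvent) {
	// Mirrors the structured log line emitted for each PLEG event.
	fmt.Printf("SyncLoop (PLEG): event for pod %q event=%+v\n", pod, ev)
	// A real kubelet would now trigger a sync of this pod's desired state.
}

func main() {
	// Values taken from one of the records in this log window.
	syncLoopIteration("openshift-dns/dns-default-t26f4", podLifecycleEvent{
		ID:   "00931d70-10d2-42f3-83ff-6a25098df202",
		Type: containerStarted,
		Data: "4d6ce0075888f1cc962f2249e36ff895b60ad0f988c3fb7941e47f3187251bbc",
	})
}
```

The dense bursts of ContainerStarted events here are consistent with every operator pod coming up at once after kubelet's restart on this node.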
event={"ID":"aef587e8-7f95-42cd-9a4f-10b1024abef0","Type":"ContainerStarted","Data":"27f68114d7fdf9acc1eea7dd62abc3b68718976e5d2d3c109f013279a4e5aa1c"} Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.204785 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:38 crc kubenswrapper[4851]: E1001 12:55:38.206733 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:38.706720295 +0000 UTC m=+147.051837781 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.229253 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6ccbw" event={"ID":"68c1357c-a93f-4ecc-9743-2bdf982607c2","Type":"ContainerStarted","Data":"02986f346d442b55c462b60b182fa0f3dcaddbfaf9d197f815b52642904c0890"} Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.236919 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tbg25" event={"ID":"dbd5b9f8-88ba-4b49-86b0-ce9de90aa30d","Type":"ContainerStarted","Data":"a8f21bd73cb7080aac30d32698a0e770b66dff1e83ea95bca6e8fccb87c7d075"} Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.236963 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tbg25" event={"ID":"dbd5b9f8-88ba-4b49-86b0-ce9de90aa30d","Type":"ContainerStarted","Data":"e67eaa4b269a1e0a0c2f5e62e139ce88c45d9623a402df9de311512086948507"} Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.244888 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b7tcr" event={"ID":"68776259-af0d-442c-b3e5-5e242a7b8a9f","Type":"ContainerStarted","Data":"364e36697caf1d8168816ad6eaabddfd4340f3546a695549da82fdad228dde6f"} Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.250835 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2kfkj" event={"ID":"3d8acc9e-4d2b-4118-9e90-d95ce539747a","Type":"ContainerStarted","Data":"b8ac545b9b679c26afe0f7cbc671783261df61e21645270fdba5cb0a954ebd74"} Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.280799 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-72dqh" event={"ID":"8de9ba8e-0c67-49d3-becd-c845dbba7335","Type":"ContainerStarted","Data":"a5b4cb63c27b45e2538af40bb331e6da1c178c9f76e11b262f6ac88eb58bdc5c"} Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.281126 
Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.281126 4851 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dq7rl container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body=
Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.281196 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dq7rl" podUID="a93ae38a-3f2c-42bf-9da9-d30f439a371f" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused"
Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.288314 4851 patch_prober.go:28] interesting pod/downloads-7954f5f757-qcsbq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body=
Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.288377 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qcsbq" podUID="928cc097-331d-49d7-8a9d-957e2032f941" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused"
Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.301039 4851 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7hnt8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body=
Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.301095 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" podUID="e723ab1e-2b65-4236-91f4-1dbae3e4acf7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused"
Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.301328 4851 patch_prober.go:28] interesting pod/console-operator-58897d9998-mvb7b container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body=
Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.301375 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-mvb7b" podUID="d07660b5-fafb-43c6-89a0-eb5a359eb314" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused"
Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.305925 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
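The readiness failures clustered here are a normal bring-up pattern: the kubelet's prober dials the pod IP before the container's server has bound its port, so the TCP connect is refused and that dial error becomes the probe output; the router's startup probe instead gets an HTTP 500 whose body lists the failing sub-checks. A rough sketch of such an HTTP check, assuming a plain HTTP endpoint rather than kubelet's real prober machinery:

```go
// Minimal sketch, hypothetical and standalone (not kubelet's prober
// package): one HTTP readiness check whose failure output matches the
// shapes seen in this log.
package main

import (
	"fmt"
	"net/http"
	"time"
)

type probeResult string

const (
	probeSuccess probeResult = "success"
	probeFailure probeResult = "failure"
)

// httpProbe performs one readiness check against a pod endpoint. A refused
// connection (server not up yet) and a non-2xx/3xx status both count as
// failure, each with its own output string.
func httpProbe(url string) (probeResult, string) {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		// e.g. Get "...": dial tcp ...: connect: connection refused
		return probeFailure, fmt.Sprintf("Get %q: %v", url, err)
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		return probeSuccess, resp.Status
	}
	// e.g. the router's startup probe: HTTP probe failed with statuscode: 500
	return probeFailure, fmt.Sprintf("HTTP probe failed with statuscode: %d", resp.StatusCode)
}

func main() {
	// With nothing listening on this port, the result mirrors the
	// "connect: connection refused" readiness failures in the log.
	result, output := httpProbe("http://127.0.0.1:59999/healthz")
	fmt.Printf("probeResult=%q output=%q\n", result, output)
}
```

Under this model, the connection-refused readiness failures and the router's statuscode-500 startup failures are the same mechanism observing different stages of server bring-up, which is why they resolve on their own as each container starts serving.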
nodeName:}" failed. No retries permitted until 2025-10-01 12:55:38.806199575 +0000 UTC m=+147.151317061 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.306995 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:38 crc kubenswrapper[4851]: E1001 12:55:38.312991 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:38.812977096 +0000 UTC m=+147.158094582 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.338016 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ffhmh" podStartSLOduration=7.338000343 podStartE2EDuration="7.338000343s" podCreationTimestamp="2025-10-01 12:55:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:38.33222491 +0000 UTC m=+146.677342396" watchObservedRunningTime="2025-10-01 12:55:38.338000343 +0000 UTC m=+146.683117829" Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.340093 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" podStartSLOduration=125.340085002 podStartE2EDuration="2m5.340085002s" podCreationTimestamp="2025-10-01 12:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:38.279381497 +0000 UTC m=+146.624498983" watchObservedRunningTime="2025-10-01 12:55:38.340085002 +0000 UTC m=+146.685202488" Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.366378 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-28f4n" Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.373189 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.388369 4851 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x5dht" podStartSLOduration=125.388351856 podStartE2EDuration="2m5.388351856s" podCreationTimestamp="2025-10-01 12:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:38.385408742 +0000 UTC m=+146.730526248" watchObservedRunningTime="2025-10-01 12:55:38.388351856 +0000 UTC m=+146.733469342" Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.409933 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:38 crc kubenswrapper[4851]: E1001 12:55:38.410191 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:38.910170142 +0000 UTC m=+147.255287628 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.455186 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-t26f4" podStartSLOduration=7.455169773 podStartE2EDuration="7.455169773s" podCreationTimestamp="2025-10-01 12:55:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:38.453468095 +0000 UTC m=+146.798585571" watchObservedRunningTime="2025-10-01 12:55:38.455169773 +0000 UTC m=+146.800287259" Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.500765 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fv9xq" podStartSLOduration=124.500749961 podStartE2EDuration="2m4.500749961s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:38.498798675 +0000 UTC m=+146.843916161" watchObservedRunningTime="2025-10-01 12:55:38.500749961 +0000 UTC m=+146.845867447" Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.512195 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:38 crc kubenswrapper[4851]: E1001 12:55:38.512559 4851 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:39.012543904 +0000 UTC m=+147.357661390 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.552933 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b7tcr" podStartSLOduration=125.552919174 podStartE2EDuration="2m5.552919174s" podCreationTimestamp="2025-10-01 12:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:38.550881127 +0000 UTC m=+146.895998613" watchObservedRunningTime="2025-10-01 12:55:38.552919174 +0000 UTC m=+146.898036660" Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.591302 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tbg25" podStartSLOduration=125.591273998 podStartE2EDuration="2m5.591273998s" podCreationTimestamp="2025-10-01 12:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:38.588116589 +0000 UTC m=+146.933234075" watchObservedRunningTime="2025-10-01 12:55:38.591273998 +0000 UTC m=+146.936391484" Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.623892 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:38 crc kubenswrapper[4851]: E1001 12:55:38.624240 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:39.124224939 +0000 UTC m=+147.469342425 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.625631 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" podStartSLOduration=124.625619078 podStartE2EDuration="2m4.625619078s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:38.62463284 +0000 UTC m=+146.969750326" watchObservedRunningTime="2025-10-01 12:55:38.625619078 +0000 UTC m=+146.970736564" Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.628115 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.628450 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.636894 4851 patch_prober.go:28] interesting pod/apiserver-76f77b778f-v2v8j container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.636944 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" podUID="c557af36-f061-43db-9a5b-e88e8df92493" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.660786 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-72dqh" podStartSLOduration=124.660765191 podStartE2EDuration="2m4.660765191s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:38.65472137 +0000 UTC m=+146.999838856" watchObservedRunningTime="2025-10-01 12:55:38.660765191 +0000 UTC m=+147.005882677" Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.661795 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.662122 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.728337 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:38 crc kubenswrapper[4851]: E1001 12:55:38.729076 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:39.22906166 +0000 UTC m=+147.574179146 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.831116 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:38 crc kubenswrapper[4851]: E1001 12:55:38.831405 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:39.331390181 +0000 UTC m=+147.676507667 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.871159 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.900730 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-6ccbw" podStartSLOduration=124.900711489 podStartE2EDuration="2m4.900711489s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:38.777289853 +0000 UTC m=+147.122407339" watchObservedRunningTime="2025-10-01 12:55:38.900711489 +0000 UTC m=+147.245828975" Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.910644 4851 patch_prober.go:28] interesting pod/router-default-5444994796-qtlzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:55:38 crc kubenswrapper[4851]: [-]has-synced failed: reason withheld Oct 01 12:55:38 crc kubenswrapper[4851]: [+]process-running ok Oct 01 12:55:38 crc kubenswrapper[4851]: healthz check failed Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.910696 4851 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-qtlzx" podUID="24e14587-7a28-41eb-9184-668739c10654" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:55:38 crc kubenswrapper[4851]: I1001 12:55:38.933750 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:38 crc kubenswrapper[4851]: E1001 12:55:38.934026 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:39.43401516 +0000 UTC m=+147.779132646 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.034337 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:39 crc kubenswrapper[4851]: E1001 12:55:39.034699 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:39.534675003 +0000 UTC m=+147.879792479 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.034832 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:39 crc kubenswrapper[4851]: E1001 12:55:39.035143 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:39.535136216 +0000 UTC m=+147.880253692 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.135567 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:39 crc kubenswrapper[4851]: E1001 12:55:39.135865 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:39.635849311 +0000 UTC m=+147.980966797 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.236264 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:39 crc kubenswrapper[4851]: E1001 12:55:39.236591 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:39.736579437 +0000 UTC m=+148.081696923 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.285474 4851 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-cqwct container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.285533 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct" podUID="e967d0be-0502-4857-9be0-1a6560362c34" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.24:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.286370 4851 generic.go:334] "Generic (PLEG): container finished" podID="23bc52e7-0e50-454f-87ee-b82b608ee34a" containerID="385bb429789121b154f05012ccdccfb383bb2c47dcac308f555f2dad6f87c844" exitCode=0 Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.286440 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w" event={"ID":"23bc52e7-0e50-454f-87ee-b82b608ee34a","Type":"ContainerDied","Data":"385bb429789121b154f05012ccdccfb383bb2c47dcac308f555f2dad6f87c844"} Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.288304 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6ccbw" event={"ID":"68c1357c-a93f-4ecc-9743-2bdf982607c2","Type":"ContainerStarted","Data":"a16369cd1997b462eaf2fe42ec315946ee8ff57b30e9f8b168af09b6c5232bd1"} Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.290208 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-b76sz" event={"ID":"3ed2545b-85c0-4141-819a-18c221911af9","Type":"ContainerStarted","Data":"2ba4f4711000656a0e0840574d7bcec825be19822ea56e32d0215d7ecb150b3f"} Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.313929 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.337833 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:39 crc kubenswrapper[4851]: E1001 12:55:39.338883 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 12:55:39.838870907 +0000 UTC m=+148.183988393 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.369561 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-b76sz" podStartSLOduration=125.369538563 podStartE2EDuration="2m5.369538563s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:39.36200029 +0000 UTC m=+147.707117776" watchObservedRunningTime="2025-10-01 12:55:39.369538563 +0000 UTC m=+147.714656069" Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.439072 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:39 crc kubenswrapper[4851]: E1001 12:55:39.439417 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:39.939402236 +0000 UTC m=+148.284519722 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.449150 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqwct" Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.539554 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:39 crc kubenswrapper[4851]: E1001 12:55:39.539981 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:40.039960267 +0000 UTC m=+148.385077743 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.641078 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:39 crc kubenswrapper[4851]: E1001 12:55:39.641394 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:40.141379692 +0000 UTC m=+148.486497178 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.742319 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:39 crc kubenswrapper[4851]: E1001 12:55:39.742479 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:40.242456457 +0000 UTC m=+148.587573943 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.742694 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:39 crc kubenswrapper[4851]: E1001 12:55:39.743000 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:40.242989872 +0000 UTC m=+148.588107348 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.843284 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:39 crc kubenswrapper[4851]: E1001 12:55:39.843462 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:40.34343683 +0000 UTC m=+148.688554316 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.843597 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:39 crc kubenswrapper[4851]: E1001 12:55:39.843903 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:40.343890793 +0000 UTC m=+148.689008279 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.908654 4851 patch_prober.go:28] interesting pod/router-default-5444994796-qtlzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:55:39 crc kubenswrapper[4851]: [-]has-synced failed: reason withheld Oct 01 12:55:39 crc kubenswrapper[4851]: [+]process-running ok Oct 01 12:55:39 crc kubenswrapper[4851]: healthz check failed Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.908713 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtlzx" podUID="24e14587-7a28-41eb-9184-668739c10654" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.917378 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8b48t"] Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.918237 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8b48t" Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.927830 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.944525 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:39 crc kubenswrapper[4851]: E1001 12:55:39.944830 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:40.444613008 +0000 UTC m=+148.789730494 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.944953 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9867a8d1-19de-4887-a5da-5d13588544c0-utilities\") pod \"certified-operators-8b48t\" (UID: \"9867a8d1-19de-4887-a5da-5d13588544c0\") " pod="openshift-marketplace/certified-operators-8b48t" Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.944995 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9867a8d1-19de-4887-a5da-5d13588544c0-catalog-content\") pod \"certified-operators-8b48t\" (UID: \"9867a8d1-19de-4887-a5da-5d13588544c0\") " pod="openshift-marketplace/certified-operators-8b48t" Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.945077 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.945109 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txm9r\" (UniqueName: \"kubernetes.io/projected/9867a8d1-19de-4887-a5da-5d13588544c0-kube-api-access-txm9r\") pod \"certified-operators-8b48t\" (UID: \"9867a8d1-19de-4887-a5da-5d13588544c0\") " pod="openshift-marketplace/certified-operators-8b48t" Oct 01 12:55:39 crc kubenswrapper[4851]: E1001 12:55:39.945316 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-01 12:55:40.445309098 +0000 UTC m=+148.790426584 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:39 crc kubenswrapper[4851]: I1001 12:55:39.969157 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8b48t"] Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.048879 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:40 crc kubenswrapper[4851]: E1001 12:55:40.049033 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:40.549009177 +0000 UTC m=+148.894126663 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.049398 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9867a8d1-19de-4887-a5da-5d13588544c0-catalog-content\") pod \"certified-operators-8b48t\" (UID: \"9867a8d1-19de-4887-a5da-5d13588544c0\") " pod="openshift-marketplace/certified-operators-8b48t" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.049480 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.049519 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txm9r\" (UniqueName: \"kubernetes.io/projected/9867a8d1-19de-4887-a5da-5d13588544c0-kube-api-access-txm9r\") pod \"certified-operators-8b48t\" (UID: \"9867a8d1-19de-4887-a5da-5d13588544c0\") " pod="openshift-marketplace/certified-operators-8b48t" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.049563 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9867a8d1-19de-4887-a5da-5d13588544c0-utilities\") pod \"certified-operators-8b48t\" (UID: 
\"9867a8d1-19de-4887-a5da-5d13588544c0\") " pod="openshift-marketplace/certified-operators-8b48t" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.050475 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9867a8d1-19de-4887-a5da-5d13588544c0-utilities\") pod \"certified-operators-8b48t\" (UID: \"9867a8d1-19de-4887-a5da-5d13588544c0\") " pod="openshift-marketplace/certified-operators-8b48t" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.050768 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9867a8d1-19de-4887-a5da-5d13588544c0-catalog-content\") pod \"certified-operators-8b48t\" (UID: \"9867a8d1-19de-4887-a5da-5d13588544c0\") " pod="openshift-marketplace/certified-operators-8b48t" Oct 01 12:55:40 crc kubenswrapper[4851]: E1001 12:55:40.050889 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:40.55087676 +0000 UTC m=+148.895994246 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.087546 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xvh4p"] Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.096663 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xvh4p" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.100903 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.128525 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xvh4p"] Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.149980 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.150131 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8f6b65-9192-41e4-80dd-81688a0714a8-catalog-content\") pod \"community-operators-xvh4p\" (UID: \"ed8f6b65-9192-41e4-80dd-81688a0714a8\") " pod="openshift-marketplace/community-operators-xvh4p" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.150202 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv5vk\" (UniqueName: \"kubernetes.io/projected/ed8f6b65-9192-41e4-80dd-81688a0714a8-kube-api-access-vv5vk\") pod \"community-operators-xvh4p\" (UID: \"ed8f6b65-9192-41e4-80dd-81688a0714a8\") " pod="openshift-marketplace/community-operators-xvh4p" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.150224 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8f6b65-9192-41e4-80dd-81688a0714a8-utilities\") pod \"community-operators-xvh4p\" (UID: \"ed8f6b65-9192-41e4-80dd-81688a0714a8\") " pod="openshift-marketplace/community-operators-xvh4p" Oct 01 12:55:40 crc kubenswrapper[4851]: E1001 12:55:40.150402 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:40.650388341 +0000 UTC m=+148.995505827 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.152651 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txm9r\" (UniqueName: \"kubernetes.io/projected/9867a8d1-19de-4887-a5da-5d13588544c0-kube-api-access-txm9r\") pod \"certified-operators-8b48t\" (UID: \"9867a8d1-19de-4887-a5da-5d13588544c0\") " pod="openshift-marketplace/certified-operators-8b48t" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.229896 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8b48t" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.253620 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8f6b65-9192-41e4-80dd-81688a0714a8-catalog-content\") pod \"community-operators-xvh4p\" (UID: \"ed8f6b65-9192-41e4-80dd-81688a0714a8\") " pod="openshift-marketplace/community-operators-xvh4p" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.253688 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.253714 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv5vk\" (UniqueName: \"kubernetes.io/projected/ed8f6b65-9192-41e4-80dd-81688a0714a8-kube-api-access-vv5vk\") pod \"community-operators-xvh4p\" (UID: \"ed8f6b65-9192-41e4-80dd-81688a0714a8\") " pod="openshift-marketplace/community-operators-xvh4p" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.253734 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8f6b65-9192-41e4-80dd-81688a0714a8-utilities\") pod \"community-operators-xvh4p\" (UID: \"ed8f6b65-9192-41e4-80dd-81688a0714a8\") " pod="openshift-marketplace/community-operators-xvh4p" Oct 01 12:55:40 crc kubenswrapper[4851]: E1001 12:55:40.254016 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:40.753998948 +0000 UTC m=+149.099116434 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.254119 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8f6b65-9192-41e4-80dd-81688a0714a8-utilities\") pod \"community-operators-xvh4p\" (UID: \"ed8f6b65-9192-41e4-80dd-81688a0714a8\") " pod="openshift-marketplace/community-operators-xvh4p" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.254409 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8f6b65-9192-41e4-80dd-81688a0714a8-catalog-content\") pod \"community-operators-xvh4p\" (UID: \"ed8f6b65-9192-41e4-80dd-81688a0714a8\") " pod="openshift-marketplace/community-operators-xvh4p" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.283866 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b64mk"] Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.284783 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b64mk" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.306530 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b64mk"] Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.307769 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv5vk\" (UniqueName: \"kubernetes.io/projected/ed8f6b65-9192-41e4-80dd-81688a0714a8-kube-api-access-vv5vk\") pod \"community-operators-xvh4p\" (UID: \"ed8f6b65-9192-41e4-80dd-81688a0714a8\") " pod="openshift-marketplace/community-operators-xvh4p" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.354617 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:40 crc kubenswrapper[4851]: E1001 12:55:40.354748 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:40.854727733 +0000 UTC m=+149.199845219 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.354798 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.354829 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.354848 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.354872 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.354910 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.354954 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7db44d49-9188-4af0-baa3-b612c4a83846-utilities\") pod \"certified-operators-b64mk\" (UID: \"7db44d49-9188-4af0-baa3-b612c4a83846\") " pod="openshift-marketplace/certified-operators-b64mk" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.354981 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6mnj\" (UniqueName: \"kubernetes.io/projected/7db44d49-9188-4af0-baa3-b612c4a83846-kube-api-access-p6mnj\") pod \"certified-operators-b64mk\" (UID: \"7db44d49-9188-4af0-baa3-b612c4a83846\") " pod="openshift-marketplace/certified-operators-b64mk" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.355003 4851 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7db44d49-9188-4af0-baa3-b612c4a83846-catalog-content\") pod \"certified-operators-b64mk\" (UID: \"7db44d49-9188-4af0-baa3-b612c4a83846\") " pod="openshift-marketplace/certified-operators-b64mk" Oct 01 12:55:40 crc kubenswrapper[4851]: E1001 12:55:40.357901 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:40.857886962 +0000 UTC m=+149.203004448 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.359001 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.362867 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.364978 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.367149 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.369595 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" event={"ID":"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7","Type":"ContainerStarted","Data":"4770223957ae38faf74e5ede75ee0061c1a8c84293b2f9c8fe555018db6a0edf"} Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.369638 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" event={"ID":"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7","Type":"ContainerStarted","Data":"e7393b0436127f2312ad7acf2c77cc953b278016d8c71795984dbea37ed0389c"} Oct 01 12:55:40 crc 
kubenswrapper[4851]: I1001 12:55:40.376415 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwhr7" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.419083 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xvh4p" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.450854 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.456008 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:40 crc kubenswrapper[4851]: E1001 12:55:40.456158 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:40.956136118 +0000 UTC m=+149.301253604 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.456294 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.456356 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7db44d49-9188-4af0-baa3-b612c4a83846-utilities\") pod \"certified-operators-b64mk\" (UID: \"7db44d49-9188-4af0-baa3-b612c4a83846\") " pod="openshift-marketplace/certified-operators-b64mk" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.456411 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6mnj\" (UniqueName: \"kubernetes.io/projected/7db44d49-9188-4af0-baa3-b612c4a83846-kube-api-access-p6mnj\") pod \"certified-operators-b64mk\" (UID: \"7db44d49-9188-4af0-baa3-b612c4a83846\") " pod="openshift-marketplace/certified-operators-b64mk" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.456436 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7db44d49-9188-4af0-baa3-b612c4a83846-catalog-content\") pod \"certified-operators-b64mk\" (UID: \"7db44d49-9188-4af0-baa3-b612c4a83846\") " pod="openshift-marketplace/certified-operators-b64mk" Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.457623 4851 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7db44d49-9188-4af0-baa3-b612c4a83846-catalog-content\") pod \"certified-operators-b64mk\" (UID: \"7db44d49-9188-4af0-baa3-b612c4a83846\") " pod="openshift-marketplace/certified-operators-b64mk"
Oct 01 12:55:40 crc kubenswrapper[4851]: E1001 12:55:40.458077 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:40.958060792 +0000 UTC m=+149.303178278 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.463297 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7db44d49-9188-4af0-baa3-b612c4a83846-utilities\") pod \"certified-operators-b64mk\" (UID: \"7db44d49-9188-4af0-baa3-b612c4a83846\") " pod="openshift-marketplace/certified-operators-b64mk"
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.463551 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.468012 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.498661 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x5dht"
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.499392 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6mnj\" (UniqueName: \"kubernetes.io/projected/7db44d49-9188-4af0-baa3-b612c4a83846-kube-api-access-p6mnj\") pod \"certified-operators-b64mk\" (UID: \"7db44d49-9188-4af0-baa3-b612c4a83846\") " pod="openshift-marketplace/certified-operators-b64mk"
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.507905 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2fngg"]
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.508825 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2fngg"
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.547288 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2fngg"]
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.557996 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:55:40 crc kubenswrapper[4851]: E1001 12:55:40.559259 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:41.059244801 +0000 UTC m=+149.404362287 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.642322 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b64mk"
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.664712 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lczlr\" (UniqueName: \"kubernetes.io/projected/95f957df-ff36-44c7-b2fc-5d2bf10c87fa-kube-api-access-lczlr\") pod \"community-operators-2fngg\" (UID: \"95f957df-ff36-44c7-b2fc-5d2bf10c87fa\") " pod="openshift-marketplace/community-operators-2fngg"
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.667530 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95f957df-ff36-44c7-b2fc-5d2bf10c87fa-catalog-content\") pod \"community-operators-2fngg\" (UID: \"95f957df-ff36-44c7-b2fc-5d2bf10c87fa\") " pod="openshift-marketplace/community-operators-2fngg"
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.667699 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc"
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.667746 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95f957df-ff36-44c7-b2fc-5d2bf10c87fa-utilities\") pod \"community-operators-2fngg\" (UID: \"95f957df-ff36-44c7-b2fc-5d2bf10c87fa\") " pod="openshift-marketplace/community-operators-2fngg"
Oct 01 12:55:40 crc kubenswrapper[4851]: E1001 12:55:40.668030 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:41.168019543 +0000 UTC m=+149.513137029 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.776883 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.777009 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lczlr\" (UniqueName: \"kubernetes.io/projected/95f957df-ff36-44c7-b2fc-5d2bf10c87fa-kube-api-access-lczlr\") pod \"community-operators-2fngg\" (UID: \"95f957df-ff36-44c7-b2fc-5d2bf10c87fa\") " pod="openshift-marketplace/community-operators-2fngg"
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.777041 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95f957df-ff36-44c7-b2fc-5d2bf10c87fa-catalog-content\") pod \"community-operators-2fngg\" (UID: \"95f957df-ff36-44c7-b2fc-5d2bf10c87fa\") " pod="openshift-marketplace/community-operators-2fngg"
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.777086 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95f957df-ff36-44c7-b2fc-5d2bf10c87fa-utilities\") pod \"community-operators-2fngg\" (UID: \"95f957df-ff36-44c7-b2fc-5d2bf10c87fa\") " pod="openshift-marketplace/community-operators-2fngg"
Oct 01 12:55:40 crc kubenswrapper[4851]: E1001 12:55:40.778164 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:41.278129364 +0000 UTC m=+149.623246850 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.778192 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95f957df-ff36-44c7-b2fc-5d2bf10c87fa-utilities\") pod \"community-operators-2fngg\" (UID: \"95f957df-ff36-44c7-b2fc-5d2bf10c87fa\") " pod="openshift-marketplace/community-operators-2fngg"
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.778547 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95f957df-ff36-44c7-b2fc-5d2bf10c87fa-catalog-content\") pod \"community-operators-2fngg\" (UID: \"95f957df-ff36-44c7-b2fc-5d2bf10c87fa\") " pod="openshift-marketplace/community-operators-2fngg"
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.833361 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lczlr\" (UniqueName: \"kubernetes.io/projected/95f957df-ff36-44c7-b2fc-5d2bf10c87fa-kube-api-access-lczlr\") pod \"community-operators-2fngg\" (UID: \"95f957df-ff36-44c7-b2fc-5d2bf10c87fa\") " pod="openshift-marketplace/community-operators-2fngg"
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.853010 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2fngg"
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.877968 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc"
Oct 01 12:55:40 crc kubenswrapper[4851]: E1001 12:55:40.878301 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:41.378286653 +0000 UTC m=+149.723404139 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.893516 4851 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.955998 4851 patch_prober.go:28] interesting pod/router-default-5444994796-qtlzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 01 12:55:40 crc kubenswrapper[4851]: [-]has-synced failed: reason withheld
Oct 01 12:55:40 crc kubenswrapper[4851]: [+]process-running ok
Oct 01 12:55:40 crc kubenswrapper[4851]: healthz check failed
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.956419 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtlzx" podUID="24e14587-7a28-41eb-9184-668739c10654" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.974883 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w"
Oct 01 12:55:40 crc kubenswrapper[4851]: I1001 12:55:40.979542 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:55:40 crc kubenswrapper[4851]: E1001 12:55:40.979913 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:41.479897913 +0000 UTC m=+149.825015399 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.076224 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8b48t"]
Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.080350 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzxg8\" (UniqueName: \"kubernetes.io/projected/23bc52e7-0e50-454f-87ee-b82b608ee34a-kube-api-access-zzxg8\") pod \"23bc52e7-0e50-454f-87ee-b82b608ee34a\" (UID: \"23bc52e7-0e50-454f-87ee-b82b608ee34a\") "
Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.080423 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23bc52e7-0e50-454f-87ee-b82b608ee34a-config-volume\") pod \"23bc52e7-0e50-454f-87ee-b82b608ee34a\" (UID: \"23bc52e7-0e50-454f-87ee-b82b608ee34a\") "
Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.080569 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23bc52e7-0e50-454f-87ee-b82b608ee34a-secret-volume\") pod \"23bc52e7-0e50-454f-87ee-b82b608ee34a\" (UID: \"23bc52e7-0e50-454f-87ee-b82b608ee34a\") "
Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.080761 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc"
Oct 01 12:55:41 crc kubenswrapper[4851]: E1001 12:55:41.081037 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:41.581024579 +0000 UTC m=+149.926142065 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.081539 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23bc52e7-0e50-454f-87ee-b82b608ee34a-config-volume" (OuterVolumeSpecName: "config-volume") pod "23bc52e7-0e50-454f-87ee-b82b608ee34a" (UID: "23bc52e7-0e50-454f-87ee-b82b608ee34a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
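The repeated "No retries permitted until ... (durationBeforeRetry 500ms)" errors above come from the kubelet's exponential backoff on failed volume operations: each failure delays the next attempt, starting at the 500ms seen here. A minimal Go sketch of such a schedule follows; the doubling factor and the roughly-two-minute cap are illustrative assumptions, not values read from this log.

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // 500ms matches the "durationBeforeRetry 500ms" entries above.
        delay := 500 * time.Millisecond
        maxDelay := 2*time.Minute + 2*time.Second // assumed cap, illustration only
        for attempt := 1; attempt <= 10; attempt++ {
            fmt.Printf("attempt %d: no retries permitted for %v\n", attempt, delay)
            delay *= 2 // assumed doubling factor
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }

In this log the root cause never changes between attempts, so the backoff only spaces out identical "driver ... not found in the list of registered CSI drivers" errors until the plugin registers.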
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.085770 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23bc52e7-0e50-454f-87ee-b82b608ee34a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "23bc52e7-0e50-454f-87ee-b82b608ee34a" (UID: "23bc52e7-0e50-454f-87ee-b82b608ee34a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.086385 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23bc52e7-0e50-454f-87ee-b82b608ee34a-kube-api-access-zzxg8" (OuterVolumeSpecName: "kube-api-access-zzxg8") pod "23bc52e7-0e50-454f-87ee-b82b608ee34a" (UID: "23bc52e7-0e50-454f-87ee-b82b608ee34a"). InnerVolumeSpecName "kube-api-access-zzxg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:41 crc kubenswrapper[4851]: W1001 12:55:41.090132 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9867a8d1_19de_4887_a5da_5d13588544c0.slice/crio-b9b612cbec5a285599ef58df8c5ba3edb5d94490b29f8b3182e929d1861a32b9 WatchSource:0}: Error finding container b9b612cbec5a285599ef58df8c5ba3edb5d94490b29f8b3182e929d1861a32b9: Status 404 returned error can't find the container with id b9b612cbec5a285599ef58df8c5ba3edb5d94490b29f8b3182e929d1861a32b9 Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.181454 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.181828 4851 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23bc52e7-0e50-454f-87ee-b82b608ee34a-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.181845 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzxg8\" (UniqueName: \"kubernetes.io/projected/23bc52e7-0e50-454f-87ee-b82b608ee34a-kube-api-access-zzxg8\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.181854 4851 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23bc52e7-0e50-454f-87ee-b82b608ee34a-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:41 crc kubenswrapper[4851]: E1001 12:55:41.181915 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:41.681900539 +0000 UTC m=+150.027018025 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.232804 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 12:55:41 crc kubenswrapper[4851]: E1001 12:55:41.233262 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23bc52e7-0e50-454f-87ee-b82b608ee34a" containerName="collect-profiles" Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.233273 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="23bc52e7-0e50-454f-87ee-b82b608ee34a" containerName="collect-profiles" Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.233362 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="23bc52e7-0e50-454f-87ee-b82b608ee34a" containerName="collect-profiles" Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.234675 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.236638 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.236835 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.252076 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.278847 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b64mk"] Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.282365 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2fngg"] Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.283130 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ad71044-3b99-4ece-a9c2-82e9d072e0cd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3ad71044-3b99-4ece-a9c2-82e9d072e0cd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.283182 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ad71044-3b99-4ece-a9c2-82e9d072e0cd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3ad71044-3b99-4ece-a9c2-82e9d072e0cd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.283219 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: 
\"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:41 crc kubenswrapper[4851]: E1001 12:55:41.283446 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:41.783435067 +0000 UTC m=+150.128552543 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:41 crc kubenswrapper[4851]: W1001 12:55:41.343802 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-93fcc9da4fe712af547560678a66e2597e139947c65837d0a024f0d48f09957f WatchSource:0}: Error finding container 93fcc9da4fe712af547560678a66e2597e139947c65837d0a024f0d48f09957f: Status 404 returned error can't find the container with id 93fcc9da4fe712af547560678a66e2597e139947c65837d0a024f0d48f09957f Oct 01 12:55:41 crc kubenswrapper[4851]: W1001 12:55:41.344644 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7db44d49_9188_4af0_baa3_b612c4a83846.slice/crio-97d9daf8ac84367da2be08d2b03fae377d7b6240c5213e85791b9520e8eee116 WatchSource:0}: Error finding container 97d9daf8ac84367da2be08d2b03fae377d7b6240c5213e85791b9520e8eee116: Status 404 returned error can't find the container with id 97d9daf8ac84367da2be08d2b03fae377d7b6240c5213e85791b9520e8eee116 Oct 01 12:55:41 crc kubenswrapper[4851]: W1001 12:55:41.345706 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95f957df_ff36_44c7_b2fc_5d2bf10c87fa.slice/crio-309fdd69d72b01012fa76c2d696bc8fc0c53c1e38626e762902ed817b659cfac WatchSource:0}: Error finding container 309fdd69d72b01012fa76c2d696bc8fc0c53c1e38626e762902ed817b659cfac: Status 404 returned error can't find the container with id 309fdd69d72b01012fa76c2d696bc8fc0c53c1e38626e762902ed817b659cfac Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.378458 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xvh4p"] Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.381340 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b64mk" event={"ID":"7db44d49-9188-4af0-baa3-b612c4a83846","Type":"ContainerStarted","Data":"97d9daf8ac84367da2be08d2b03fae377d7b6240c5213e85791b9520e8eee116"} Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.383816 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.384035 4851 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ad71044-3b99-4ece-a9c2-82e9d072e0cd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3ad71044-3b99-4ece-a9c2-82e9d072e0cd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.384079 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ad71044-3b99-4ece-a9c2-82e9d072e0cd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3ad71044-3b99-4ece-a9c2-82e9d072e0cd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.384159 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ad71044-3b99-4ece-a9c2-82e9d072e0cd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3ad71044-3b99-4ece-a9c2-82e9d072e0cd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:55:41 crc kubenswrapper[4851]: E1001 12:55:41.384225 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:55:41.884210594 +0000 UTC m=+150.229328070 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.385021 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fngg" event={"ID":"95f957df-ff36-44c7-b2fc-5d2bf10c87fa","Type":"ContainerStarted","Data":"309fdd69d72b01012fa76c2d696bc8fc0c53c1e38626e762902ed817b659cfac"} Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.387858 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w" event={"ID":"23bc52e7-0e50-454f-87ee-b82b608ee34a","Type":"ContainerDied","Data":"436692da6b98da799ec9e5b90a291518093ccf3763472e78b40ed653d2fbee80"} Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.387886 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="436692da6b98da799ec9e5b90a291518093ccf3763472e78b40ed653d2fbee80" Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.387955 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w" Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.389606 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"93fcc9da4fe712af547560678a66e2597e139947c65837d0a024f0d48f09957f"} Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.390883 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ac1bed5e973cc7895f625c02ae6b170ad3d6fb124d4315263dca2061a34c92c2"} Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.394219 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" event={"ID":"d17b1f73-c9dd-494a-a996-d4bc6ee4aec7","Type":"ContainerStarted","Data":"b39afd91f48b6772516b045b5d648c89a98ed0008df734c0a3573f857fedcdfe"} Oct 01 12:55:41 crc kubenswrapper[4851]: W1001 12:55:41.395989 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-9f8039203279650b1ed361b2cb8a17f6e8ae9abc1a70063cb2c06cd555c66742 WatchSource:0}: Error finding container 9f8039203279650b1ed361b2cb8a17f6e8ae9abc1a70063cb2c06cd555c66742: Status 404 returned error can't find the container with id 9f8039203279650b1ed361b2cb8a17f6e8ae9abc1a70063cb2c06cd555c66742 Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.396147 4851 generic.go:334] "Generic (PLEG): container finished" podID="9867a8d1-19de-4887-a5da-5d13588544c0" containerID="89179ee4f4ea0d8f7c4f8a781b87d4712e51266ddf4e8de081498d5a7ce8c826" exitCode=0 Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.396175 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8b48t" event={"ID":"9867a8d1-19de-4887-a5da-5d13588544c0","Type":"ContainerDied","Data":"89179ee4f4ea0d8f7c4f8a781b87d4712e51266ddf4e8de081498d5a7ce8c826"} Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.396206 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8b48t" event={"ID":"9867a8d1-19de-4887-a5da-5d13588544c0","Type":"ContainerStarted","Data":"b9b612cbec5a285599ef58df8c5ba3edb5d94490b29f8b3182e929d1861a32b9"} Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.403654 4851 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.406250 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ad71044-3b99-4ece-a9c2-82e9d072e0cd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3ad71044-3b99-4ece-a9c2-82e9d072e0cd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.419001 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-gjvsn" podStartSLOduration=10.418976866 podStartE2EDuration="10.418976866s" podCreationTimestamp="2025-10-01 12:55:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 
Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.485044 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc"
Oct 01 12:55:41 crc kubenswrapper[4851]: E1001 12:55:41.486050 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:55:41.986032 +0000 UTC m=+150.331149556 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6s9sc" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.505069 4851 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-01T12:55:40.893543954Z","Handler":null,"Name":""}
Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.523893 4851 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.524246 4851 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.558021 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.585900 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.589276 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.686743 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc"
Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.689732 4851 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.689778 4851 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc"
Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.741466 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.754329 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6s9sc\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc"
Oct 01 12:55:41 crc kubenswrapper[4851]: W1001 12:55:41.766620 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3ad71044_3b99_4ece_a9c2_82e9d072e0cd.slice/crio-cb30bce360fd6b03b8500f087f84bfdb8cd2d0a7971de1fff0bb3104e9a718c7 WatchSource:0}: Error finding container cb30bce360fd6b03b8500f087f84bfdb8cd2d0a7971de1fff0bb3104e9a718c7: Status 404 returned error can't find the container with id cb30bce360fd6b03b8500f087f84bfdb8cd2d0a7971de1fff0bb3104e9a718c7
Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.871872 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-msbpg"]
Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.872817 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-msbpg"
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-msbpg" Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.875070 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.882508 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-msbpg"] Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.908046 4851 patch_prober.go:28] interesting pod/router-default-5444994796-qtlzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:55:41 crc kubenswrapper[4851]: [-]has-synced failed: reason withheld Oct 01 12:55:41 crc kubenswrapper[4851]: [+]process-running ok Oct 01 12:55:41 crc kubenswrapper[4851]: healthz check failed Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.908088 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtlzx" podUID="24e14587-7a28-41eb-9184-668739c10654" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.989065 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgkjw\" (UniqueName: \"kubernetes.io/projected/04430c41-7121-4c5d-803b-db06d0dd7237-kube-api-access-bgkjw\") pod \"redhat-marketplace-msbpg\" (UID: \"04430c41-7121-4c5d-803b-db06d0dd7237\") " pod="openshift-marketplace/redhat-marketplace-msbpg" Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.989106 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04430c41-7121-4c5d-803b-db06d0dd7237-catalog-content\") pod \"redhat-marketplace-msbpg\" (UID: \"04430c41-7121-4c5d-803b-db06d0dd7237\") " pod="openshift-marketplace/redhat-marketplace-msbpg" Oct 01 12:55:41 crc kubenswrapper[4851]: I1001 12:55:41.989141 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04430c41-7121-4c5d-803b-db06d0dd7237-utilities\") pod \"redhat-marketplace-msbpg\" (UID: \"04430c41-7121-4c5d-803b-db06d0dd7237\") " pod="openshift-marketplace/redhat-marketplace-msbpg" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.019801 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.119191 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgkjw\" (UniqueName: \"kubernetes.io/projected/04430c41-7121-4c5d-803b-db06d0dd7237-kube-api-access-bgkjw\") pod \"redhat-marketplace-msbpg\" (UID: \"04430c41-7121-4c5d-803b-db06d0dd7237\") " pod="openshift-marketplace/redhat-marketplace-msbpg" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.119265 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04430c41-7121-4c5d-803b-db06d0dd7237-catalog-content\") pod \"redhat-marketplace-msbpg\" (UID: \"04430c41-7121-4c5d-803b-db06d0dd7237\") " pod="openshift-marketplace/redhat-marketplace-msbpg" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.119346 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04430c41-7121-4c5d-803b-db06d0dd7237-utilities\") pod \"redhat-marketplace-msbpg\" (UID: \"04430c41-7121-4c5d-803b-db06d0dd7237\") " pod="openshift-marketplace/redhat-marketplace-msbpg" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.120123 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04430c41-7121-4c5d-803b-db06d0dd7237-catalog-content\") pod \"redhat-marketplace-msbpg\" (UID: \"04430c41-7121-4c5d-803b-db06d0dd7237\") " pod="openshift-marketplace/redhat-marketplace-msbpg" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.120150 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04430c41-7121-4c5d-803b-db06d0dd7237-utilities\") pod \"redhat-marketplace-msbpg\" (UID: \"04430c41-7121-4c5d-803b-db06d0dd7237\") " pod="openshift-marketplace/redhat-marketplace-msbpg" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.147359 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgkjw\" (UniqueName: \"kubernetes.io/projected/04430c41-7121-4c5d-803b-db06d0dd7237-kube-api-access-bgkjw\") pod \"redhat-marketplace-msbpg\" (UID: \"04430c41-7121-4c5d-803b-db06d0dd7237\") " pod="openshift-marketplace/redhat-marketplace-msbpg" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.269658 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6s9sc"] Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.275527 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-56b5n"] Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.284762 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-56b5n" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.294663 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-msbpg" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.302028 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-56b5n"] Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.348136 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.425167 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjbmw\" (UniqueName: \"kubernetes.io/projected/7d8661b3-299a-4c38-95a6-4216fcb6cd3a-kube-api-access-rjbmw\") pod \"redhat-marketplace-56b5n\" (UID: \"7d8661b3-299a-4c38-95a6-4216fcb6cd3a\") " pod="openshift-marketplace/redhat-marketplace-56b5n" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.425209 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d8661b3-299a-4c38-95a6-4216fcb6cd3a-catalog-content\") pod \"redhat-marketplace-56b5n\" (UID: \"7d8661b3-299a-4c38-95a6-4216fcb6cd3a\") " pod="openshift-marketplace/redhat-marketplace-56b5n" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.425239 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d8661b3-299a-4c38-95a6-4216fcb6cd3a-utilities\") pod \"redhat-marketplace-56b5n\" (UID: \"7d8661b3-299a-4c38-95a6-4216fcb6cd3a\") " pod="openshift-marketplace/redhat-marketplace-56b5n" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.427696 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3ad71044-3b99-4ece-a9c2-82e9d072e0cd","Type":"ContainerStarted","Data":"30d801484dea86aa38b62209118cce0f7d4c37929ee400f16a41f0f0f0e8665e"} Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.427722 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3ad71044-3b99-4ece-a9c2-82e9d072e0cd","Type":"ContainerStarted","Data":"cb30bce360fd6b03b8500f087f84bfdb8cd2d0a7971de1fff0bb3104e9a718c7"} Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.431340 4851 generic.go:334] "Generic (PLEG): container finished" podID="ed8f6b65-9192-41e4-80dd-81688a0714a8" containerID="3d236a2b04cfc4baa6a5dabee673aa329077f504020635bbf5b31013e213dd8f" exitCode=0 Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.431402 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvh4p" event={"ID":"ed8f6b65-9192-41e4-80dd-81688a0714a8","Type":"ContainerDied","Data":"3d236a2b04cfc4baa6a5dabee673aa329077f504020635bbf5b31013e213dd8f"} Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.431429 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvh4p" event={"ID":"ed8f6b65-9192-41e4-80dd-81688a0714a8","Type":"ContainerStarted","Data":"98badfbd244239b72d57bafb87c60490d2047d9ac56c52018f0988a13bb74627"} Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.435435 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"3b158507162d3ed9ea9a99b95b281c064c6e82d485e1412624886af8f9775d58"} Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.441400 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"bd95ece59f3689631d892fe2139280ccf90ccf36e8be45ad79876209ff23a859"} Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.442028 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.447090 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ab70e28c1f2ec87d759a923e9af3c7bc1748b6ad0077ee1549fa6561d048d263"} Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.447181 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"9f8039203279650b1ed361b2cb8a17f6e8ae9abc1a70063cb2c06cd555c66742"} Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.464804 4851 generic.go:334] "Generic (PLEG): container finished" podID="7db44d49-9188-4af0-baa3-b612c4a83846" containerID="97e3a3a06a38e8d628418eed82ee158cd79489615f07940cd4792af3fd527721" exitCode=0 Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.464899 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b64mk" event={"ID":"7db44d49-9188-4af0-baa3-b612c4a83846","Type":"ContainerDied","Data":"97e3a3a06a38e8d628418eed82ee158cd79489615f07940cd4792af3fd527721"} Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.473921 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" event={"ID":"a54d2004-e4e0-4b2c-b026-e82a6f241da7","Type":"ContainerStarted","Data":"3b07dcc93d3937aa9db3944a9750d8ca78685d29d82fe3027fdec3d58d0fcdc1"} Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.474745 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.478258 4851 generic.go:334] "Generic (PLEG): container finished" podID="95f957df-ff36-44c7-b2fc-5d2bf10c87fa" containerID="4ef204ce078aa4bed8adaac846e664e5b218d50f2707d2e095480bc754915d76" exitCode=0 Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.479472 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fngg" event={"ID":"95f957df-ff36-44c7-b2fc-5d2bf10c87fa","Type":"ContainerDied","Data":"4ef204ce078aa4bed8adaac846e664e5b218d50f2707d2e095480bc754915d76"} Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.528101 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d8661b3-299a-4c38-95a6-4216fcb6cd3a-catalog-content\") pod \"redhat-marketplace-56b5n\" (UID: \"7d8661b3-299a-4c38-95a6-4216fcb6cd3a\") " pod="openshift-marketplace/redhat-marketplace-56b5n" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.528186 4851 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d8661b3-299a-4c38-95a6-4216fcb6cd3a-utilities\") pod \"redhat-marketplace-56b5n\" (UID: \"7d8661b3-299a-4c38-95a6-4216fcb6cd3a\") " pod="openshift-marketplace/redhat-marketplace-56b5n" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.528283 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjbmw\" (UniqueName: \"kubernetes.io/projected/7d8661b3-299a-4c38-95a6-4216fcb6cd3a-kube-api-access-rjbmw\") pod \"redhat-marketplace-56b5n\" (UID: \"7d8661b3-299a-4c38-95a6-4216fcb6cd3a\") " pod="openshift-marketplace/redhat-marketplace-56b5n" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.530831 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d8661b3-299a-4c38-95a6-4216fcb6cd3a-utilities\") pod \"redhat-marketplace-56b5n\" (UID: \"7d8661b3-299a-4c38-95a6-4216fcb6cd3a\") " pod="openshift-marketplace/redhat-marketplace-56b5n" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.530831 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d8661b3-299a-4c38-95a6-4216fcb6cd3a-catalog-content\") pod \"redhat-marketplace-56b5n\" (UID: \"7d8661b3-299a-4c38-95a6-4216fcb6cd3a\") " pod="openshift-marketplace/redhat-marketplace-56b5n" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.547344 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjbmw\" (UniqueName: \"kubernetes.io/projected/7d8661b3-299a-4c38-95a6-4216fcb6cd3a-kube-api-access-rjbmw\") pod \"redhat-marketplace-56b5n\" (UID: \"7d8661b3-299a-4c38-95a6-4216fcb6cd3a\") " pod="openshift-marketplace/redhat-marketplace-56b5n" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.555303 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-msbpg"] Oct 01 12:55:42 crc kubenswrapper[4851]: W1001 12:55:42.577653 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04430c41_7121_4c5d_803b_db06d0dd7237.slice/crio-e2cd39b564ed0181016a806b62c8792fcf575f2deeaca3da1e69c555814a70c4 WatchSource:0}: Error finding container e2cd39b564ed0181016a806b62c8792fcf575f2deeaca3da1e69c555814a70c4: Status 404 returned error can't find the container with id e2cd39b564ed0181016a806b62c8792fcf575f2deeaca3da1e69c555814a70c4 Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.607894 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-56b5n" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.656315 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.656300108 podStartE2EDuration="1.656300108s" podCreationTimestamp="2025-10-01 12:55:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:42.651102371 +0000 UTC m=+150.996219857" watchObservedRunningTime="2025-10-01 12:55:42.656300108 +0000 UTC m=+151.001417584" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.728034 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" podStartSLOduration=129.728017684 podStartE2EDuration="2m9.728017684s" podCreationTimestamp="2025-10-01 12:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:42.725614436 +0000 UTC m=+151.070731922" watchObservedRunningTime="2025-10-01 12:55:42.728017684 +0000 UTC m=+151.073135170" Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.854883 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-56b5n"] Oct 01 12:55:42 crc kubenswrapper[4851]: W1001 12:55:42.875062 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d8661b3_299a_4c38_95a6_4216fcb6cd3a.slice/crio-922c8270e2d6232249cfcd26160dbf0a40f2d32da7e4190a07baaeb564aeab24 WatchSource:0}: Error finding container 922c8270e2d6232249cfcd26160dbf0a40f2d32da7e4190a07baaeb564aeab24: Status 404 returned error can't find the container with id 922c8270e2d6232249cfcd26160dbf0a40f2d32da7e4190a07baaeb564aeab24 Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.911705 4851 patch_prober.go:28] interesting pod/router-default-5444994796-qtlzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:55:42 crc kubenswrapper[4851]: [-]has-synced failed: reason withheld Oct 01 12:55:42 crc kubenswrapper[4851]: [+]process-running ok Oct 01 12:55:42 crc kubenswrapper[4851]: healthz check failed Oct 01 12:55:42 crc kubenswrapper[4851]: I1001 12:55:42.911964 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtlzx" podUID="24e14587-7a28-41eb-9184-668739c10654" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.095531 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.096303 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.100513 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.100779 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.100943 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.236216 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c27542f4-64a4-4173-a6d3-e985304f1630-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c27542f4-64a4-4173-a6d3-e985304f1630\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.236254 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c27542f4-64a4-4173-a6d3-e985304f1630-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c27542f4-64a4-4173-a6d3-e985304f1630\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.273034 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-97jrf"] Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.274043 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-97jrf" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.275872 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.290085 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-97jrf"] Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.336795 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c27542f4-64a4-4173-a6d3-e985304f1630-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c27542f4-64a4-4173-a6d3-e985304f1630\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.336902 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c27542f4-64a4-4173-a6d3-e985304f1630-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c27542f4-64a4-4173-a6d3-e985304f1630\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.337028 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c27542f4-64a4-4173-a6d3-e985304f1630-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c27542f4-64a4-4173-a6d3-e985304f1630\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.356756 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c27542f4-64a4-4173-a6d3-e985304f1630-kube-api-access\") pod 
\"revision-pruner-8-crc\" (UID: \"c27542f4-64a4-4173-a6d3-e985304f1630\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.437746 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0390df89-e5c5-4684-8ef5-3006aed29cd6-catalog-content\") pod \"redhat-operators-97jrf\" (UID: \"0390df89-e5c5-4684-8ef5-3006aed29cd6\") " pod="openshift-marketplace/redhat-operators-97jrf" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.437806 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76qf9\" (UniqueName: \"kubernetes.io/projected/0390df89-e5c5-4684-8ef5-3006aed29cd6-kube-api-access-76qf9\") pod \"redhat-operators-97jrf\" (UID: \"0390df89-e5c5-4684-8ef5-3006aed29cd6\") " pod="openshift-marketplace/redhat-operators-97jrf" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.437903 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0390df89-e5c5-4684-8ef5-3006aed29cd6-utilities\") pod \"redhat-operators-97jrf\" (UID: \"0390df89-e5c5-4684-8ef5-3006aed29cd6\") " pod="openshift-marketplace/redhat-operators-97jrf" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.448013 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.490766 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" event={"ID":"a54d2004-e4e0-4b2c-b026-e82a6f241da7","Type":"ContainerStarted","Data":"2e277d222ffa15dacb49fb9a061fb45830e0d13f76753162da71703d9e045a9b"} Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.493085 4851 generic.go:334] "Generic (PLEG): container finished" podID="3ad71044-3b99-4ece-a9c2-82e9d072e0cd" containerID="30d801484dea86aa38b62209118cce0f7d4c37929ee400f16a41f0f0f0e8665e" exitCode=0 Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.493529 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3ad71044-3b99-4ece-a9c2-82e9d072e0cd","Type":"ContainerDied","Data":"30d801484dea86aa38b62209118cce0f7d4c37929ee400f16a41f0f0f0e8665e"} Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.497973 4851 generic.go:334] "Generic (PLEG): container finished" podID="7d8661b3-299a-4c38-95a6-4216fcb6cd3a" containerID="702bd40760aa55b98f9c57edbc524055443a79bd6bbcb6d65fd7b04d1a2cb6f9" exitCode=0 Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.498069 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-56b5n" event={"ID":"7d8661b3-299a-4c38-95a6-4216fcb6cd3a","Type":"ContainerDied","Data":"702bd40760aa55b98f9c57edbc524055443a79bd6bbcb6d65fd7b04d1a2cb6f9"} Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.498099 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-56b5n" event={"ID":"7d8661b3-299a-4c38-95a6-4216fcb6cd3a","Type":"ContainerStarted","Data":"922c8270e2d6232249cfcd26160dbf0a40f2d32da7e4190a07baaeb564aeab24"} Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.501106 4851 generic.go:334] "Generic (PLEG): container finished" podID="04430c41-7121-4c5d-803b-db06d0dd7237" 
containerID="1ef475d8ef0f9ef1b28d83cb0fff8ad86b4eb47775afdc528533ae3bf9aae00e" exitCode=0 Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.501390 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-msbpg" event={"ID":"04430c41-7121-4c5d-803b-db06d0dd7237","Type":"ContainerDied","Data":"1ef475d8ef0f9ef1b28d83cb0fff8ad86b4eb47775afdc528533ae3bf9aae00e"} Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.501421 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-msbpg" event={"ID":"04430c41-7121-4c5d-803b-db06d0dd7237","Type":"ContainerStarted","Data":"e2cd39b564ed0181016a806b62c8792fcf575f2deeaca3da1e69c555814a70c4"} Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.539425 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0390df89-e5c5-4684-8ef5-3006aed29cd6-catalog-content\") pod \"redhat-operators-97jrf\" (UID: \"0390df89-e5c5-4684-8ef5-3006aed29cd6\") " pod="openshift-marketplace/redhat-operators-97jrf" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.539474 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76qf9\" (UniqueName: \"kubernetes.io/projected/0390df89-e5c5-4684-8ef5-3006aed29cd6-kube-api-access-76qf9\") pod \"redhat-operators-97jrf\" (UID: \"0390df89-e5c5-4684-8ef5-3006aed29cd6\") " pod="openshift-marketplace/redhat-operators-97jrf" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.539513 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0390df89-e5c5-4684-8ef5-3006aed29cd6-utilities\") pod \"redhat-operators-97jrf\" (UID: \"0390df89-e5c5-4684-8ef5-3006aed29cd6\") " pod="openshift-marketplace/redhat-operators-97jrf" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.540283 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0390df89-e5c5-4684-8ef5-3006aed29cd6-catalog-content\") pod \"redhat-operators-97jrf\" (UID: \"0390df89-e5c5-4684-8ef5-3006aed29cd6\") " pod="openshift-marketplace/redhat-operators-97jrf" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.540759 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0390df89-e5c5-4684-8ef5-3006aed29cd6-utilities\") pod \"redhat-operators-97jrf\" (UID: \"0390df89-e5c5-4684-8ef5-3006aed29cd6\") " pod="openshift-marketplace/redhat-operators-97jrf" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.557126 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76qf9\" (UniqueName: \"kubernetes.io/projected/0390df89-e5c5-4684-8ef5-3006aed29cd6-kube-api-access-76qf9\") pod \"redhat-operators-97jrf\" (UID: \"0390df89-e5c5-4684-8ef5-3006aed29cd6\") " pod="openshift-marketplace/redhat-operators-97jrf" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.629763 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.635640 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-97jrf" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.637197 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-v2v8j" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.686100 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7mmcp"] Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.687191 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7mmcp" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.701361 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7mmcp"] Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.717975 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.730788 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.734302 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.778999 4851 patch_prober.go:28] interesting pod/console-f9d7485db-xjvqh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.782114 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-xjvqh" podUID="b5b5efe6-729a-431f-b8cd-67562ec18593" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.851861 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebac54f4-17ba-4305-aa02-a6676cd35731-utilities\") pod \"redhat-operators-7mmcp\" (UID: \"ebac54f4-17ba-4305-aa02-a6676cd35731\") " pod="openshift-marketplace/redhat-operators-7mmcp" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.852669 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nc9v\" (UniqueName: \"kubernetes.io/projected/ebac54f4-17ba-4305-aa02-a6676cd35731-kube-api-access-7nc9v\") pod \"redhat-operators-7mmcp\" (UID: \"ebac54f4-17ba-4305-aa02-a6676cd35731\") " pod="openshift-marketplace/redhat-operators-7mmcp" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.852723 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebac54f4-17ba-4305-aa02-a6676cd35731-catalog-content\") pod \"redhat-operators-7mmcp\" (UID: \"ebac54f4-17ba-4305-aa02-a6676cd35731\") " pod="openshift-marketplace/redhat-operators-7mmcp" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.905868 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-qtlzx" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.908466 4851 patch_prober.go:28] interesting 
pod/router-default-5444994796-qtlzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:55:43 crc kubenswrapper[4851]: [-]has-synced failed: reason withheld Oct 01 12:55:43 crc kubenswrapper[4851]: [+]process-running ok Oct 01 12:55:43 crc kubenswrapper[4851]: healthz check failed Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.908494 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtlzx" podUID="24e14587-7a28-41eb-9184-668739c10654" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.954524 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebac54f4-17ba-4305-aa02-a6676cd35731-utilities\") pod \"redhat-operators-7mmcp\" (UID: \"ebac54f4-17ba-4305-aa02-a6676cd35731\") " pod="openshift-marketplace/redhat-operators-7mmcp" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.954606 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nc9v\" (UniqueName: \"kubernetes.io/projected/ebac54f4-17ba-4305-aa02-a6676cd35731-kube-api-access-7nc9v\") pod \"redhat-operators-7mmcp\" (UID: \"ebac54f4-17ba-4305-aa02-a6676cd35731\") " pod="openshift-marketplace/redhat-operators-7mmcp" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.954625 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebac54f4-17ba-4305-aa02-a6676cd35731-catalog-content\") pod \"redhat-operators-7mmcp\" (UID: \"ebac54f4-17ba-4305-aa02-a6676cd35731\") " pod="openshift-marketplace/redhat-operators-7mmcp" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.955377 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebac54f4-17ba-4305-aa02-a6676cd35731-utilities\") pod \"redhat-operators-7mmcp\" (UID: \"ebac54f4-17ba-4305-aa02-a6676cd35731\") " pod="openshift-marketplace/redhat-operators-7mmcp" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.956234 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebac54f4-17ba-4305-aa02-a6676cd35731-catalog-content\") pod \"redhat-operators-7mmcp\" (UID: \"ebac54f4-17ba-4305-aa02-a6676cd35731\") " pod="openshift-marketplace/redhat-operators-7mmcp" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.962669 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-mvb7b" Oct 01 12:55:43 crc kubenswrapper[4851]: I1001 12:55:43.982027 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nc9v\" (UniqueName: \"kubernetes.io/projected/ebac54f4-17ba-4305-aa02-a6676cd35731-kube-api-access-7nc9v\") pod \"redhat-operators-7mmcp\" (UID: \"ebac54f4-17ba-4305-aa02-a6676cd35731\") " pod="openshift-marketplace/redhat-operators-7mmcp" Oct 01 12:55:44 crc kubenswrapper[4851]: I1001 12:55:44.058084 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-97jrf"] Oct 01 12:55:44 crc kubenswrapper[4851]: I1001 12:55:44.087667 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7mmcp" Oct 01 12:55:44 crc kubenswrapper[4851]: I1001 12:55:44.530141 4851 generic.go:334] "Generic (PLEG): container finished" podID="0390df89-e5c5-4684-8ef5-3006aed29cd6" containerID="3085d853ca175611b856f9dfc8f0cd377bcc4a07f744db31fff05cfe986c217c" exitCode=0 Oct 01 12:55:44 crc kubenswrapper[4851]: I1001 12:55:44.530226 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97jrf" event={"ID":"0390df89-e5c5-4684-8ef5-3006aed29cd6","Type":"ContainerDied","Data":"3085d853ca175611b856f9dfc8f0cd377bcc4a07f744db31fff05cfe986c217c"} Oct 01 12:55:44 crc kubenswrapper[4851]: I1001 12:55:44.530611 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97jrf" event={"ID":"0390df89-e5c5-4684-8ef5-3006aed29cd6","Type":"ContainerStarted","Data":"c3b572492ae3da9258ca83baa95e4daaec33fb69d00a9be14b8116334f251fb9"} Oct 01 12:55:44 crc kubenswrapper[4851]: I1001 12:55:44.537764 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c27542f4-64a4-4173-a6d3-e985304f1630","Type":"ContainerStarted","Data":"48b7903425a719bc8a156e736b863f56ed4f9672bd1126826b69161f3e955984"} Oct 01 12:55:44 crc kubenswrapper[4851]: I1001 12:55:44.537791 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c27542f4-64a4-4173-a6d3-e985304f1630","Type":"ContainerStarted","Data":"af04a74720b3c9d27815eef482ab2a9d13cc1df462c2ff0e811444b2a43c52f0"} Oct 01 12:55:44 crc kubenswrapper[4851]: I1001 12:55:44.562820 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=1.562805964 podStartE2EDuration="1.562805964s" podCreationTimestamp="2025-10-01 12:55:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:44.558636717 +0000 UTC m=+152.903754193" watchObservedRunningTime="2025-10-01 12:55:44.562805964 +0000 UTC m=+152.907923450" Oct 01 12:55:44 crc kubenswrapper[4851]: I1001 12:55:44.642372 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7mmcp"] Oct 01 12:55:44 crc kubenswrapper[4851]: I1001 12:55:44.657047 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" Oct 01 12:55:44 crc kubenswrapper[4851]: I1001 12:55:44.682256 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dq7rl" Oct 01 12:55:44 crc kubenswrapper[4851]: I1001 12:55:44.706060 4851 patch_prober.go:28] interesting pod/downloads-7954f5f757-qcsbq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 01 12:55:44 crc kubenswrapper[4851]: I1001 12:55:44.706104 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qcsbq" podUID="928cc097-331d-49d7-8a9d-957e2032f941" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 01 12:55:44 crc kubenswrapper[4851]: I1001 12:55:44.706180 4851 
Oct 01 12:55:44 crc kubenswrapper[4851]: I1001 12:55:44.706242 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qcsbq" podUID="928cc097-331d-49d7-8a9d-957e2032f941" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused"
Oct 01 12:55:44 crc kubenswrapper[4851]: I1001 12:55:44.885598 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 01 12:55:44 crc kubenswrapper[4851]: I1001 12:55:44.913811 4851 patch_prober.go:28] interesting pod/router-default-5444994796-qtlzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 01 12:55:44 crc kubenswrapper[4851]: [-]has-synced failed: reason withheld
Oct 01 12:55:44 crc kubenswrapper[4851]: [+]process-running ok
Oct 01 12:55:44 crc kubenswrapper[4851]: healthz check failed
Oct 01 12:55:44 crc kubenswrapper[4851]: I1001 12:55:44.913871 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtlzx" podUID="24e14587-7a28-41eb-9184-668739c10654" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 01 12:55:44 crc kubenswrapper[4851]: I1001 12:55:44.975138 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ad71044-3b99-4ece-a9c2-82e9d072e0cd-kubelet-dir\") pod \"3ad71044-3b99-4ece-a9c2-82e9d072e0cd\" (UID: \"3ad71044-3b99-4ece-a9c2-82e9d072e0cd\") "
Oct 01 12:55:44 crc kubenswrapper[4851]: I1001 12:55:44.975206 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ad71044-3b99-4ece-a9c2-82e9d072e0cd-kube-api-access\") pod \"3ad71044-3b99-4ece-a9c2-82e9d072e0cd\" (UID: \"3ad71044-3b99-4ece-a9c2-82e9d072e0cd\") "
Oct 01 12:55:44 crc kubenswrapper[4851]: I1001 12:55:44.975576 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ad71044-3b99-4ece-a9c2-82e9d072e0cd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3ad71044-3b99-4ece-a9c2-82e9d072e0cd" (UID: "3ad71044-3b99-4ece-a9c2-82e9d072e0cd"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 01 12:55:44 crc kubenswrapper[4851]: I1001 12:55:44.982022 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ad71044-3b99-4ece-a9c2-82e9d072e0cd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3ad71044-3b99-4ece-a9c2-82e9d072e0cd" (UID: "3ad71044-3b99-4ece-a9c2-82e9d072e0cd"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:55:45 crc kubenswrapper[4851]: I1001 12:55:45.076305 4851 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ad71044-3b99-4ece-a9c2-82e9d072e0cd-kubelet-dir\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:45 crc kubenswrapper[4851]: I1001 12:55:45.076339 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ad71044-3b99-4ece-a9c2-82e9d072e0cd-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:45 crc kubenswrapper[4851]: I1001 12:55:45.558364 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3ad71044-3b99-4ece-a9c2-82e9d072e0cd","Type":"ContainerDied","Data":"cb30bce360fd6b03b8500f087f84bfdb8cd2d0a7971de1fff0bb3104e9a718c7"}
Oct 01 12:55:45 crc kubenswrapper[4851]: I1001 12:55:45.558407 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb30bce360fd6b03b8500f087f84bfdb8cd2d0a7971de1fff0bb3104e9a718c7"
Oct 01 12:55:45 crc kubenswrapper[4851]: I1001 12:55:45.558590 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 01 12:55:45 crc kubenswrapper[4851]: I1001 12:55:45.573024 4851 generic.go:334] "Generic (PLEG): container finished" podID="ebac54f4-17ba-4305-aa02-a6676cd35731" containerID="51eff03ff57493e4b6e9716eb4396cc3b36c7bce525c0ebc679dbbab3506f7cf" exitCode=0
Oct 01 12:55:45 crc kubenswrapper[4851]: I1001 12:55:45.573286 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mmcp" event={"ID":"ebac54f4-17ba-4305-aa02-a6676cd35731","Type":"ContainerDied","Data":"51eff03ff57493e4b6e9716eb4396cc3b36c7bce525c0ebc679dbbab3506f7cf"}
Oct 01 12:55:45 crc kubenswrapper[4851]: I1001 12:55:45.573338 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mmcp" event={"ID":"ebac54f4-17ba-4305-aa02-a6676cd35731","Type":"ContainerStarted","Data":"df91a77ed2c6991a7dc0c54a37afb57dc0721946bcc8533222c2f89aa9cce410"}
Oct 01 12:55:45 crc kubenswrapper[4851]: I1001 12:55:45.574924 4851 generic.go:334] "Generic (PLEG): container finished" podID="c27542f4-64a4-4173-a6d3-e985304f1630" containerID="48b7903425a719bc8a156e736b863f56ed4f9672bd1126826b69161f3e955984" exitCode=0
Oct 01 12:55:45 crc kubenswrapper[4851]: I1001 12:55:45.574952 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c27542f4-64a4-4173-a6d3-e985304f1630","Type":"ContainerDied","Data":"48b7903425a719bc8a156e736b863f56ed4f9672bd1126826b69161f3e955984"}
Oct 01 12:55:45 crc kubenswrapper[4851]: I1001 12:55:45.907558 4851 patch_prober.go:28] interesting pod/router-default-5444994796-qtlzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 01 12:55:45 crc kubenswrapper[4851]: [-]has-synced failed: reason withheld
Oct 01 12:55:45 crc kubenswrapper[4851]: [+]process-running ok
Oct 01 12:55:45 crc kubenswrapper[4851]: healthz check failed
Oct 01 12:55:45 crc kubenswrapper[4851]: I1001 12:55:45.907897 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtlzx" podUID="24e14587-7a28-41eb-9184-668739c10654" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 01 12:55:46 crc kubenswrapper[4851]: I1001 12:55:46.377426 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-t26f4"
Oct 01 12:55:46 crc kubenswrapper[4851]: I1001 12:55:46.910628 4851 patch_prober.go:28] interesting pod/router-default-5444994796-qtlzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 01 12:55:46 crc kubenswrapper[4851]: [-]has-synced failed: reason withheld
Oct 01 12:55:46 crc kubenswrapper[4851]: [+]process-running ok
Oct 01 12:55:46 crc kubenswrapper[4851]: healthz check failed
Oct 01 12:55:46 crc kubenswrapper[4851]: I1001 12:55:46.910682 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtlzx" podUID="24e14587-7a28-41eb-9184-668739c10654" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 01 12:55:47 crc kubenswrapper[4851]: I1001 12:55:47.908092 4851 patch_prober.go:28] interesting pod/router-default-5444994796-qtlzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 01 12:55:47 crc kubenswrapper[4851]: [-]has-synced failed: reason withheld
Oct 01 12:55:47 crc kubenswrapper[4851]: [+]process-running ok
Oct 01 12:55:47 crc kubenswrapper[4851]: healthz check failed
Oct 01 12:55:47 crc kubenswrapper[4851]: I1001 12:55:47.908385 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtlzx" podUID="24e14587-7a28-41eb-9184-668739c10654" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 01 12:55:48 crc kubenswrapper[4851]: I1001 12:55:48.909379 4851 patch_prober.go:28] interesting pod/router-default-5444994796-qtlzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 01 12:55:48 crc kubenswrapper[4851]: [-]has-synced failed: reason withheld
Oct 01 12:55:48 crc kubenswrapper[4851]: [+]process-running ok
Oct 01 12:55:48 crc kubenswrapper[4851]: healthz check failed
Oct 01 12:55:48 crc kubenswrapper[4851]: I1001 12:55:48.909436 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtlzx" podUID="24e14587-7a28-41eb-9184-668739c10654" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 01 12:55:49 crc kubenswrapper[4851]: I1001 12:55:49.908277 4851 patch_prober.go:28] interesting pod/router-default-5444994796-qtlzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 01 12:55:49 crc kubenswrapper[4851]: [+]has-synced ok
Oct 01 12:55:49 crc kubenswrapper[4851]: [+]process-running ok
Oct 01 12:55:49 crc kubenswrapper[4851]: healthz check failed
Oct 01 12:55:49 crc kubenswrapper[4851]: I1001 12:55:49.908347 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtlzx" podUID="24e14587-7a28-41eb-9184-668739c10654" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
statuscode: 500" Oct 01 12:55:50 crc kubenswrapper[4851]: I1001 12:55:50.916227 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-qtlzx" Oct 01 12:55:50 crc kubenswrapper[4851]: I1001 12:55:50.925938 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-qtlzx" Oct 01 12:55:53 crc kubenswrapper[4851]: I1001 12:55:53.837988 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:53 crc kubenswrapper[4851]: I1001 12:55:53.844296 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 12:55:54 crc kubenswrapper[4851]: I1001 12:55:54.075756 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:55:54 crc kubenswrapper[4851]: I1001 12:55:54.211174 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c27542f4-64a4-4173-a6d3-e985304f1630-kubelet-dir\") pod \"c27542f4-64a4-4173-a6d3-e985304f1630\" (UID: \"c27542f4-64a4-4173-a6d3-e985304f1630\") " Oct 01 12:55:54 crc kubenswrapper[4851]: I1001 12:55:54.211289 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c27542f4-64a4-4173-a6d3-e985304f1630-kube-api-access\") pod \"c27542f4-64a4-4173-a6d3-e985304f1630\" (UID: \"c27542f4-64a4-4173-a6d3-e985304f1630\") " Oct 01 12:55:54 crc kubenswrapper[4851]: I1001 12:55:54.214192 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c27542f4-64a4-4173-a6d3-e985304f1630-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c27542f4-64a4-4173-a6d3-e985304f1630" (UID: "c27542f4-64a4-4173-a6d3-e985304f1630"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:55:54 crc kubenswrapper[4851]: I1001 12:55:54.222994 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c27542f4-64a4-4173-a6d3-e985304f1630-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c27542f4-64a4-4173-a6d3-e985304f1630" (UID: "c27542f4-64a4-4173-a6d3-e985304f1630"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:54 crc kubenswrapper[4851]: I1001 12:55:54.314103 4851 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c27542f4-64a4-4173-a6d3-e985304f1630-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:54 crc kubenswrapper[4851]: I1001 12:55:54.314149 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c27542f4-64a4-4173-a6d3-e985304f1630-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:54 crc kubenswrapper[4851]: I1001 12:55:54.672690 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:55:54 crc kubenswrapper[4851]: I1001 12:55:54.672675 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c27542f4-64a4-4173-a6d3-e985304f1630","Type":"ContainerDied","Data":"af04a74720b3c9d27815eef482ab2a9d13cc1df462c2ff0e811444b2a43c52f0"} Oct 01 12:55:54 crc kubenswrapper[4851]: I1001 12:55:54.672734 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af04a74720b3c9d27815eef482ab2a9d13cc1df462c2ff0e811444b2a43c52f0" Oct 01 12:55:54 crc kubenswrapper[4851]: I1001 12:55:54.711285 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-qcsbq" Oct 01 12:55:56 crc kubenswrapper[4851]: I1001 12:55:56.753971 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs\") pod \"network-metrics-daemon-75dqp\" (UID: \"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\") " pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:55:56 crc kubenswrapper[4851]: I1001 12:55:56.773409 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a8fe88f-cdfe-4415-98a0-4cc8f018a962-metrics-certs\") pod \"network-metrics-daemon-75dqp\" (UID: \"8a8fe88f-cdfe-4415-98a0-4cc8f018a962\") " pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:55:56 crc kubenswrapper[4851]: I1001 12:55:56.943108 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75dqp" Oct 01 12:56:00 crc kubenswrapper[4851]: I1001 12:56:00.050568 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:56:00 crc kubenswrapper[4851]: I1001 12:56:00.051856 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:56:02 crc kubenswrapper[4851]: I1001 12:56:02.025216 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 12:56:04 crc kubenswrapper[4851]: E1001 12:56:04.653782 4851 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 01 12:56:04 crc kubenswrapper[4851]: E1001 12:56:04.654080 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lczlr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-2fngg_openshift-marketplace(95f957df-ff36-44c7-b2fc-5d2bf10c87fa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 12:56:04 crc kubenswrapper[4851]: E1001 12:56:04.656091 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-2fngg" podUID="95f957df-ff36-44c7-b2fc-5d2bf10c87fa" Oct 01 12:56:04 crc kubenswrapper[4851]: E1001 12:56:04.907085 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-2fngg" podUID="95f957df-ff36-44c7-b2fc-5d2bf10c87fa" Oct 01 12:56:06 crc kubenswrapper[4851]: E1001 12:56:06.202604 4851 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 01 12:56:06 crc kubenswrapper[4851]: E1001 12:56:06.202777 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
Oct 01 12:56:06 crc kubenswrapper[4851]: E1001 12:56:06.203975 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-b64mk" podUID="7db44d49-9188-4af0-baa3-b612c4a83846"
Oct 01 12:56:08 crc kubenswrapper[4851]: E1001 12:56:08.884709 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-b64mk" podUID="7db44d49-9188-4af0-baa3-b612c4a83846"
Oct 01 12:56:09 crc kubenswrapper[4851]: I1001 12:56:09.426320 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-75dqp"]
Oct 01 12:56:09 crc kubenswrapper[4851]: W1001 12:56:09.452614 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a8fe88f_cdfe_4415_98a0_4cc8f018a962.slice/crio-ecb87ea6ad1a5230e6b6dc845960e9f4fa80c2a980059ccbb83441cf06f0a1aa WatchSource:0}: Error finding container ecb87ea6ad1a5230e6b6dc845960e9f4fa80c2a980059ccbb83441cf06f0a1aa: Status 404 returned error can't find the container with id ecb87ea6ad1a5230e6b6dc845960e9f4fa80c2a980059ccbb83441cf06f0a1aa
Oct 01 12:56:09 crc kubenswrapper[4851]: I1001 12:56:09.765364 4851 generic.go:334] "Generic (PLEG): container finished" podID="ed8f6b65-9192-41e4-80dd-81688a0714a8" containerID="6067ea7853a4b226f755d78ae40358fc04609e319c688f6ca97bbee3eb258cdf" exitCode=0
Oct 01 12:56:09 crc kubenswrapper[4851]: I1001 12:56:09.765421 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvh4p" event={"ID":"ed8f6b65-9192-41e4-80dd-81688a0714a8","Type":"ContainerDied","Data":"6067ea7853a4b226f755d78ae40358fc04609e319c688f6ca97bbee3eb258cdf"}
Oct 01 12:56:09 crc kubenswrapper[4851]: I1001 12:56:09.775137 4851 generic.go:334] "Generic (PLEG): container finished" podID="7d8661b3-299a-4c38-95a6-4216fcb6cd3a" containerID="0ef853f49b9222b49efe5e158161190ce27590e987fb3911acb4cc4e86796f3d" exitCode=0
Oct 01 12:56:09 crc kubenswrapper[4851]: I1001 12:56:09.775436 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-56b5n" event={"ID":"7d8661b3-299a-4c38-95a6-4216fcb6cd3a","Type":"ContainerDied","Data":"0ef853f49b9222b49efe5e158161190ce27590e987fb3911acb4cc4e86796f3d"}
Oct 01 12:56:09 crc kubenswrapper[4851]: I1001 12:56:09.781598 4851 generic.go:334] "Generic (PLEG): container finished" podID="ebac54f4-17ba-4305-aa02-a6676cd35731" containerID="64c7a2125e47b20aba6dc15d143816dfeae2a51788dd431016a30ac737af6fd8" exitCode=0
Oct 01 12:56:09 crc kubenswrapper[4851]: I1001 12:56:09.781727 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mmcp" event={"ID":"ebac54f4-17ba-4305-aa02-a6676cd35731","Type":"ContainerDied","Data":"64c7a2125e47b20aba6dc15d143816dfeae2a51788dd431016a30ac737af6fd8"}
Oct 01 12:56:09 crc kubenswrapper[4851]: I1001 12:56:09.792800 4851 generic.go:334] "Generic (PLEG): container finished" podID="9867a8d1-19de-4887-a5da-5d13588544c0" containerID="4f1826a85dfddccef1ad3d5b32a584b282fa85e080e0ea4006f9694f78131923" exitCode=0
Oct 01 12:56:09 crc kubenswrapper[4851]: I1001 12:56:09.792884 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8b48t" event={"ID":"9867a8d1-19de-4887-a5da-5d13588544c0","Type":"ContainerDied","Data":"4f1826a85dfddccef1ad3d5b32a584b282fa85e080e0ea4006f9694f78131923"}
Oct 01 12:56:09 crc kubenswrapper[4851]: I1001 12:56:09.799008 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97jrf" event={"ID":"0390df89-e5c5-4684-8ef5-3006aed29cd6","Type":"ContainerStarted","Data":"2ef89ae48bf3050785e20caaf1e41bdfffde25761d68b539c6fe7a74d689f4f1"}
Oct 01 12:56:09 crc kubenswrapper[4851]: I1001 12:56:09.809912 4851 generic.go:334] "Generic (PLEG): container finished" podID="04430c41-7121-4c5d-803b-db06d0dd7237" containerID="fbb54e6a60dfa104ee5a5315d0194e3715fb29fe2262f18954ee39a85c1547d1" exitCode=0
Oct 01 12:56:09 crc kubenswrapper[4851]: I1001 12:56:09.809984 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-msbpg" event={"ID":"04430c41-7121-4c5d-803b-db06d0dd7237","Type":"ContainerDied","Data":"fbb54e6a60dfa104ee5a5315d0194e3715fb29fe2262f18954ee39a85c1547d1"}
Oct 01 12:56:09 crc kubenswrapper[4851]: I1001 12:56:09.814953 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-75dqp" event={"ID":"8a8fe88f-cdfe-4415-98a0-4cc8f018a962","Type":"ContainerStarted","Data":"31a097df2e232017b4ad443ed8de8531710823419345809fa35b47957a1312cd"}
Oct 01 12:56:09 crc kubenswrapper[4851]: I1001 12:56:09.814993 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-75dqp" event={"ID":"8a8fe88f-cdfe-4415-98a0-4cc8f018a962","Type":"ContainerStarted","Data":"ecb87ea6ad1a5230e6b6dc845960e9f4fa80c2a980059ccbb83441cf06f0a1aa"}
Oct 01 12:56:10 crc kubenswrapper[4851]: I1001 12:56:10.822783 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-msbpg" event={"ID":"04430c41-7121-4c5d-803b-db06d0dd7237","Type":"ContainerStarted","Data":"5ecebf7d8998720bda03d2040ff441c8dc1303ac7da82310b51b756a1ecb5ada"}
Oct 01 12:56:10 crc kubenswrapper[4851]: I1001 12:56:10.824010 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-75dqp" event={"ID":"8a8fe88f-cdfe-4415-98a0-4cc8f018a962","Type":"ContainerStarted","Data":"b80b73da2dd6d24effa9edd5621060c4b958c8f93f5d2094f29fbbed6df7c46d"}
Oct 01 12:56:10 crc kubenswrapper[4851]: I1001 12:56:10.826111 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvh4p" event={"ID":"ed8f6b65-9192-41e4-80dd-81688a0714a8","Type":"ContainerStarted","Data":"92a419962e5d269d99879f7c0cbab5d91032658846ea9112b8b898dd5676d918"}
Oct 01 12:56:10 crc kubenswrapper[4851]: I1001 12:56:10.837129 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-56b5n" event={"ID":"7d8661b3-299a-4c38-95a6-4216fcb6cd3a","Type":"ContainerStarted","Data":"bea445f63fb898c71ce7d3766bdc8f100995c752383d421eb8d6160b06cb8560"}
Oct 01 12:56:10 crc kubenswrapper[4851]: I1001 12:56:10.847610 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mmcp" event={"ID":"ebac54f4-17ba-4305-aa02-a6676cd35731","Type":"ContainerStarted","Data":"760d0d80c1dc532c09973f337f527787844bc94c8062570a20d51642209724d4"}
Oct 01 12:56:10 crc kubenswrapper[4851]: I1001 12:56:10.853629 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8b48t" event={"ID":"9867a8d1-19de-4887-a5da-5d13588544c0","Type":"ContainerStarted","Data":"3863a599b41d770a0d56f379581a195e84cd518eeba5c2b0c2487285bd0cd119"}
Oct 01 12:56:10 crc kubenswrapper[4851]: I1001 12:56:10.855644 4851 generic.go:334] "Generic (PLEG): container finished" podID="0390df89-e5c5-4684-8ef5-3006aed29cd6" containerID="2ef89ae48bf3050785e20caaf1e41bdfffde25761d68b539c6fe7a74d689f4f1" exitCode=0
Oct 01 12:56:10 crc kubenswrapper[4851]: I1001 12:56:10.855687 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97jrf" event={"ID":"0390df89-e5c5-4684-8ef5-3006aed29cd6","Type":"ContainerDied","Data":"2ef89ae48bf3050785e20caaf1e41bdfffde25761d68b539c6fe7a74d689f4f1"}
Oct 01 12:56:10 crc kubenswrapper[4851]: I1001 12:56:10.855710 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97jrf" event={"ID":"0390df89-e5c5-4684-8ef5-3006aed29cd6","Type":"ContainerStarted","Data":"41e33532d385dadaba6ed68d92efeca67ec5c0b6719d4aa3e5735fea3bf7b433"}
Oct 01 12:56:10 crc kubenswrapper[4851]: I1001 12:56:10.866505 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-msbpg" podStartSLOduration=2.923556688 podStartE2EDuration="29.866477835s" podCreationTimestamp="2025-10-01 12:55:41 +0000 UTC" firstStartedPulling="2025-10-01 12:55:43.503049129 +0000 UTC m=+151.848166615" lastFinishedPulling="2025-10-01 12:56:10.445970246 +0000 UTC m=+178.791087762" observedRunningTime="2025-10-01 12:56:10.864859499 +0000 UTC m=+179.209976995" watchObservedRunningTime="2025-10-01 12:56:10.866477835 +0000 UTC m=+179.211595321"
Oct 01 12:56:10 crc kubenswrapper[4851]: I1001 12:56:10.896909 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xvh4p" podStartSLOduration=2.772063587 podStartE2EDuration="30.896887254s" podCreationTimestamp="2025-10-01 12:55:40 +0000 UTC" firstStartedPulling="2025-10-01 12:55:42.436083436 +0000 UTC m=+150.781200922" lastFinishedPulling="2025-10-01 12:56:10.560907043 +0000 UTC m=+178.906024589" observedRunningTime="2025-10-01 12:56:10.882152768 +0000 UTC m=+179.227270264" watchObservedRunningTime="2025-10-01 12:56:10.896887254 +0000 UTC m=+179.242004750"
Oct 01 12:56:10 crc kubenswrapper[4851]: I1001 12:56:10.898185 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-56b5n" podStartSLOduration=2.027653097 podStartE2EDuration="28.8981783s" podCreationTimestamp="2025-10-01 12:55:42 +0000 UTC" firstStartedPulling="2025-10-01 12:55:43.500843476 +0000 UTC m=+151.845960962" lastFinishedPulling="2025-10-01 12:56:10.371368639 +0000 UTC m=+178.716486165" observedRunningTime="2025-10-01 12:56:10.895815364 +0000 UTC m=+179.240932860" watchObservedRunningTime="2025-10-01 12:56:10.8981783 +0000 UTC m=+179.243295796"
Oct 01 12:56:10 crc kubenswrapper[4851]: I1001 12:56:10.937997 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-75dqp" podStartSLOduration=156.937968514 podStartE2EDuration="2m36.937968514s" podCreationTimestamp="2025-10-01 12:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:56:10.932013916 +0000 UTC m=+179.277131442" watchObservedRunningTime="2025-10-01 12:56:10.937968514 +0000 UTC m=+179.283086010"
Oct 01 12:56:10 crc kubenswrapper[4851]: I1001 12:56:10.938396 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8b48t" podStartSLOduration=3.067843825 podStartE2EDuration="31.938382506s" podCreationTimestamp="2025-10-01 12:55:39 +0000 UTC" firstStartedPulling="2025-10-01 12:55:41.402209672 +0000 UTC m=+149.747327158" lastFinishedPulling="2025-10-01 12:56:10.272748353 +0000 UTC m=+178.617865839" observedRunningTime="2025-10-01 12:56:10.916899769 +0000 UTC m=+179.262017255" watchObservedRunningTime="2025-10-01 12:56:10.938382506 +0000 UTC m=+179.283500002"
Oct 01 12:56:10 crc kubenswrapper[4851]: I1001 12:56:10.958646 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-97jrf" podStartSLOduration=2.051634053 podStartE2EDuration="27.958626598s" podCreationTimestamp="2025-10-01 12:55:43 +0000 UTC" firstStartedPulling="2025-10-01 12:55:44.532115387 +0000 UTC m=+152.877232873" lastFinishedPulling="2025-10-01 12:56:10.439107892 +0000 UTC m=+178.784225418" observedRunningTime="2025-10-01 12:56:10.95443916 +0000 UTC m=+179.299556646" watchObservedRunningTime="2025-10-01 12:56:10.958626598 +0000 UTC m=+179.303744094"
Oct 01 12:56:10 crc kubenswrapper[4851]: I1001 12:56:10.971325 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7mmcp" podStartSLOduration=3.314169428 podStartE2EDuration="27.971308496s" podCreationTimestamp="2025-10-01 12:55:43 +0000 UTC" firstStartedPulling="2025-10-01 12:55:45.589623981 +0000 UTC m=+153.934741477" lastFinishedPulling="2025-10-01 12:56:10.246763049 +0000 UTC m=+178.591880545" observedRunningTime="2025-10-01 12:56:10.968832306 +0000 UTC m=+179.313949792" watchObservedRunningTime="2025-10-01 12:56:10.971308496 +0000 UTC m=+179.316425982"
Oct 01 12:56:12 crc kubenswrapper[4851]: I1001 12:56:12.367821 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-msbpg"
Oct 01 12:56:12 crc kubenswrapper[4851]: I1001 12:56:12.368071 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-msbpg"
Oct 01 12:56:12 crc kubenswrapper[4851]: I1001 12:56:12.476435 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-msbpg"
Oct 01 12:56:12 crc kubenswrapper[4851]: I1001 12:56:12.609426 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-56b5n"
Oct 01 12:56:12 crc kubenswrapper[4851]: I1001 12:56:12.609467 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-56b5n"
Oct 01 12:56:12 crc kubenswrapper[4851]: I1001 12:56:12.667402 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-56b5n"
Oct 01 12:56:13 crc kubenswrapper[4851]: I1001 12:56:13.636313 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-97jrf"
Oct 01 12:56:13 crc kubenswrapper[4851]: I1001 12:56:13.636573 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-97jrf"
Oct 01 12:56:14 crc kubenswrapper[4851]: I1001 12:56:14.088369 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7mmcp"
Oct 01 12:56:14 crc kubenswrapper[4851]: I1001 12:56:14.088841 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7mmcp"
Oct 01 12:56:14 crc kubenswrapper[4851]: I1001 12:56:14.309450 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2kfkj"
Oct 01 12:56:14 crc kubenswrapper[4851]: I1001 12:56:14.719124 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-97jrf" podUID="0390df89-e5c5-4684-8ef5-3006aed29cd6" containerName="registry-server" probeResult="failure" output=<
Oct 01 12:56:14 crc kubenswrapper[4851]: timeout: failed to connect service ":50051" within 1s
Oct 01 12:56:14 crc kubenswrapper[4851]: >
Oct 01 12:56:15 crc kubenswrapper[4851]: I1001 12:56:15.152719 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7mmcp" podUID="ebac54f4-17ba-4305-aa02-a6676cd35731" containerName="registry-server" probeResult="failure" output=<
Oct 01 12:56:15 crc kubenswrapper[4851]: timeout: failed to connect service ":50051" within 1s
Oct 01 12:56:15 crc kubenswrapper[4851]: >
Oct 01 12:56:20 crc kubenswrapper[4851]: I1001 12:56:20.230968 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8b48t"
Oct 01 12:56:20 crc kubenswrapper[4851]: I1001 12:56:20.231667 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8b48t"
Oct 01 12:56:20 crc kubenswrapper[4851]: I1001 12:56:20.292450 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8b48t"
Oct 01 12:56:20 crc kubenswrapper[4851]: I1001 12:56:20.420705 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xvh4p"
Oct 01 12:56:20 crc kubenswrapper[4851]: I1001 12:56:20.420772 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xvh4p"
Oct 01 12:56:20 crc kubenswrapper[4851]: I1001 12:56:20.475901 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 12:56:20 crc kubenswrapper[4851]: I1001 12:56:20.490885 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xvh4p"
Oct 01 12:56:20 crc kubenswrapper[4851]: I1001 12:56:20.976432 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8b48t"
Oct 01 12:56:20 crc kubenswrapper[4851]: I1001 12:56:20.998286 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xvh4p"
Oct 01 12:56:22 crc kubenswrapper[4851]: I1001 12:56:22.372564 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-msbpg"
Oct 01 12:56:22 crc kubenswrapper[4851]: I1001 12:56:22.644398 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-56b5n"
Oct 01 12:56:22 crc kubenswrapper[4851]: I1001 12:56:22.940282 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fngg" event={"ID":"95f957df-ff36-44c7-b2fc-5d2bf10c87fa","Type":"ContainerStarted","Data":"4a88f5ed478e4ccd9062a21ea8dacd543eafe4688f20a1c0b81e9f3afe5abc68"}
Oct 01 12:56:23 crc kubenswrapper[4851]: I1001 12:56:23.702639 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-97jrf"
Oct 01 12:56:23 crc kubenswrapper[4851]: I1001 12:56:23.760004 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-97jrf"
Oct 01 12:56:23 crc kubenswrapper[4851]: I1001 12:56:23.948990 4851 generic.go:334] "Generic (PLEG): container finished" podID="95f957df-ff36-44c7-b2fc-5d2bf10c87fa" containerID="4a88f5ed478e4ccd9062a21ea8dacd543eafe4688f20a1c0b81e9f3afe5abc68" exitCode=0
Oct 01 12:56:23 crc kubenswrapper[4851]: I1001 12:56:23.949042 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fngg" event={"ID":"95f957df-ff36-44c7-b2fc-5d2bf10c87fa","Type":"ContainerDied","Data":"4a88f5ed478e4ccd9062a21ea8dacd543eafe4688f20a1c0b81e9f3afe5abc68"}
Oct 01 12:56:24 crc kubenswrapper[4851]: I1001 12:56:24.139851 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7mmcp"
Oct 01 12:56:24 crc kubenswrapper[4851]: I1001 12:56:24.186321 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7mmcp"
Oct 01 12:56:25 crc kubenswrapper[4851]: I1001 12:56:25.374815 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-56b5n"]
Oct 01 12:56:25 crc kubenswrapper[4851]: I1001 12:56:25.375482 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-56b5n" podUID="7d8661b3-299a-4c38-95a6-4216fcb6cd3a" containerName="registry-server" containerID="cri-o://bea445f63fb898c71ce7d3766bdc8f100995c752383d421eb8d6160b06cb8560" gracePeriod=2
containerName="registry-server" containerID="cri-o://bea445f63fb898c71ce7d3766bdc8f100995c752383d421eb8d6160b06cb8560" gracePeriod=2 Oct 01 12:56:25 crc kubenswrapper[4851]: I1001 12:56:25.965576 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-56b5n" event={"ID":"7d8661b3-299a-4c38-95a6-4216fcb6cd3a","Type":"ContainerDied","Data":"bea445f63fb898c71ce7d3766bdc8f100995c752383d421eb8d6160b06cb8560"} Oct 01 12:56:25 crc kubenswrapper[4851]: I1001 12:56:25.965617 4851 generic.go:334] "Generic (PLEG): container finished" podID="7d8661b3-299a-4c38-95a6-4216fcb6cd3a" containerID="bea445f63fb898c71ce7d3766bdc8f100995c752383d421eb8d6160b06cb8560" exitCode=0 Oct 01 12:56:25 crc kubenswrapper[4851]: I1001 12:56:25.979898 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7mmcp"] Oct 01 12:56:25 crc kubenswrapper[4851]: I1001 12:56:25.980232 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7mmcp" podUID="ebac54f4-17ba-4305-aa02-a6676cd35731" containerName="registry-server" containerID="cri-o://760d0d80c1dc532c09973f337f527787844bc94c8062570a20d51642209724d4" gracePeriod=2 Oct 01 12:56:26 crc kubenswrapper[4851]: I1001 12:56:26.608234 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-56b5n" Oct 01 12:56:26 crc kubenswrapper[4851]: I1001 12:56:26.718878 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjbmw\" (UniqueName: \"kubernetes.io/projected/7d8661b3-299a-4c38-95a6-4216fcb6cd3a-kube-api-access-rjbmw\") pod \"7d8661b3-299a-4c38-95a6-4216fcb6cd3a\" (UID: \"7d8661b3-299a-4c38-95a6-4216fcb6cd3a\") " Oct 01 12:56:26 crc kubenswrapper[4851]: I1001 12:56:26.718960 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d8661b3-299a-4c38-95a6-4216fcb6cd3a-catalog-content\") pod \"7d8661b3-299a-4c38-95a6-4216fcb6cd3a\" (UID: \"7d8661b3-299a-4c38-95a6-4216fcb6cd3a\") " Oct 01 12:56:26 crc kubenswrapper[4851]: I1001 12:56:26.719044 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d8661b3-299a-4c38-95a6-4216fcb6cd3a-utilities\") pod \"7d8661b3-299a-4c38-95a6-4216fcb6cd3a\" (UID: \"7d8661b3-299a-4c38-95a6-4216fcb6cd3a\") " Oct 01 12:56:26 crc kubenswrapper[4851]: I1001 12:56:26.720554 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d8661b3-299a-4c38-95a6-4216fcb6cd3a-utilities" (OuterVolumeSpecName: "utilities") pod "7d8661b3-299a-4c38-95a6-4216fcb6cd3a" (UID: "7d8661b3-299a-4c38-95a6-4216fcb6cd3a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:26 crc kubenswrapper[4851]: I1001 12:56:26.720919 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d8661b3-299a-4c38-95a6-4216fcb6cd3a-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:26 crc kubenswrapper[4851]: I1001 12:56:26.725026 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d8661b3-299a-4c38-95a6-4216fcb6cd3a-kube-api-access-rjbmw" (OuterVolumeSpecName: "kube-api-access-rjbmw") pod "7d8661b3-299a-4c38-95a6-4216fcb6cd3a" (UID: "7d8661b3-299a-4c38-95a6-4216fcb6cd3a"). InnerVolumeSpecName "kube-api-access-rjbmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:26 crc kubenswrapper[4851]: I1001 12:56:26.749184 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d8661b3-299a-4c38-95a6-4216fcb6cd3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d8661b3-299a-4c38-95a6-4216fcb6cd3a" (UID: "7d8661b3-299a-4c38-95a6-4216fcb6cd3a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:26 crc kubenswrapper[4851]: I1001 12:56:26.822800 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjbmw\" (UniqueName: \"kubernetes.io/projected/7d8661b3-299a-4c38-95a6-4216fcb6cd3a-kube-api-access-rjbmw\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:26 crc kubenswrapper[4851]: I1001 12:56:26.822866 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d8661b3-299a-4c38-95a6-4216fcb6cd3a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:26 crc kubenswrapper[4851]: I1001 12:56:26.977338 4851 generic.go:334] "Generic (PLEG): container finished" podID="ebac54f4-17ba-4305-aa02-a6676cd35731" containerID="760d0d80c1dc532c09973f337f527787844bc94c8062570a20d51642209724d4" exitCode=0 Oct 01 12:56:26 crc kubenswrapper[4851]: I1001 12:56:26.977422 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mmcp" event={"ID":"ebac54f4-17ba-4305-aa02-a6676cd35731","Type":"ContainerDied","Data":"760d0d80c1dc532c09973f337f527787844bc94c8062570a20d51642209724d4"} Oct 01 12:56:26 crc kubenswrapper[4851]: I1001 12:56:26.980418 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-56b5n" event={"ID":"7d8661b3-299a-4c38-95a6-4216fcb6cd3a","Type":"ContainerDied","Data":"922c8270e2d6232249cfcd26160dbf0a40f2d32da7e4190a07baaeb564aeab24"} Oct 01 12:56:26 crc kubenswrapper[4851]: I1001 12:56:26.980473 4851 scope.go:117] "RemoveContainer" containerID="bea445f63fb898c71ce7d3766bdc8f100995c752383d421eb8d6160b06cb8560" Oct 01 12:56:26 crc kubenswrapper[4851]: I1001 12:56:26.980670 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-56b5n" Oct 01 12:56:27 crc kubenswrapper[4851]: I1001 12:56:27.024111 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-56b5n"] Oct 01 12:56:27 crc kubenswrapper[4851]: I1001 12:56:27.030175 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-56b5n"] Oct 01 12:56:27 crc kubenswrapper[4851]: I1001 12:56:27.805634 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7mmcp" Oct 01 12:56:27 crc kubenswrapper[4851]: I1001 12:56:27.936625 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebac54f4-17ba-4305-aa02-a6676cd35731-catalog-content\") pod \"ebac54f4-17ba-4305-aa02-a6676cd35731\" (UID: \"ebac54f4-17ba-4305-aa02-a6676cd35731\") " Oct 01 12:56:27 crc kubenswrapper[4851]: I1001 12:56:27.936678 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nc9v\" (UniqueName: \"kubernetes.io/projected/ebac54f4-17ba-4305-aa02-a6676cd35731-kube-api-access-7nc9v\") pod \"ebac54f4-17ba-4305-aa02-a6676cd35731\" (UID: \"ebac54f4-17ba-4305-aa02-a6676cd35731\") " Oct 01 12:56:27 crc kubenswrapper[4851]: I1001 12:56:27.936721 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebac54f4-17ba-4305-aa02-a6676cd35731-utilities\") pod \"ebac54f4-17ba-4305-aa02-a6676cd35731\" (UID: \"ebac54f4-17ba-4305-aa02-a6676cd35731\") " Oct 01 12:56:27 crc kubenswrapper[4851]: I1001 12:56:27.938161 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebac54f4-17ba-4305-aa02-a6676cd35731-utilities" (OuterVolumeSpecName: "utilities") pod "ebac54f4-17ba-4305-aa02-a6676cd35731" (UID: "ebac54f4-17ba-4305-aa02-a6676cd35731"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:27 crc kubenswrapper[4851]: I1001 12:56:27.948878 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebac54f4-17ba-4305-aa02-a6676cd35731-kube-api-access-7nc9v" (OuterVolumeSpecName: "kube-api-access-7nc9v") pod "ebac54f4-17ba-4305-aa02-a6676cd35731" (UID: "ebac54f4-17ba-4305-aa02-a6676cd35731"). InnerVolumeSpecName "kube-api-access-7nc9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:28 crc kubenswrapper[4851]: I1001 12:56:28.002454 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mmcp" event={"ID":"ebac54f4-17ba-4305-aa02-a6676cd35731","Type":"ContainerDied","Data":"df91a77ed2c6991a7dc0c54a37afb57dc0721946bcc8533222c2f89aa9cce410"} Oct 01 12:56:28 crc kubenswrapper[4851]: I1001 12:56:28.002589 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7mmcp" Oct 01 12:56:28 crc kubenswrapper[4851]: I1001 12:56:28.038566 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebac54f4-17ba-4305-aa02-a6676cd35731-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:28 crc kubenswrapper[4851]: I1001 12:56:28.038609 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nc9v\" (UniqueName: \"kubernetes.io/projected/ebac54f4-17ba-4305-aa02-a6676cd35731-kube-api-access-7nc9v\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:28 crc kubenswrapper[4851]: I1001 12:56:28.268150 4851 scope.go:117] "RemoveContainer" containerID="0ef853f49b9222b49efe5e158161190ce27590e987fb3911acb4cc4e86796f3d" Oct 01 12:56:28 crc kubenswrapper[4851]: I1001 12:56:28.347753 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d8661b3-299a-4c38-95a6-4216fcb6cd3a" path="/var/lib/kubelet/pods/7d8661b3-299a-4c38-95a6-4216fcb6cd3a/volumes" Oct 01 12:56:28 crc kubenswrapper[4851]: I1001 12:56:28.685784 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebac54f4-17ba-4305-aa02-a6676cd35731-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebac54f4-17ba-4305-aa02-a6676cd35731" (UID: "ebac54f4-17ba-4305-aa02-a6676cd35731"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:28 crc kubenswrapper[4851]: I1001 12:56:28.752961 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebac54f4-17ba-4305-aa02-a6676cd35731-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:28 crc kubenswrapper[4851]: I1001 12:56:28.946428 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7mmcp"] Oct 01 12:56:28 crc kubenswrapper[4851]: I1001 12:56:28.951051 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7mmcp"] Oct 01 12:56:28 crc kubenswrapper[4851]: I1001 12:56:28.990921 4851 scope.go:117] "RemoveContainer" containerID="702bd40760aa55b98f9c57edbc524055443a79bd6bbcb6d65fd7b04d1a2cb6f9" Oct 01 12:56:29 crc kubenswrapper[4851]: I1001 12:56:29.012889 4851 scope.go:117] "RemoveContainer" containerID="760d0d80c1dc532c09973f337f527787844bc94c8062570a20d51642209724d4" Oct 01 12:56:29 crc kubenswrapper[4851]: I1001 12:56:29.097456 4851 scope.go:117] "RemoveContainer" containerID="64c7a2125e47b20aba6dc15d143816dfeae2a51788dd431016a30ac737af6fd8" Oct 01 12:56:29 crc kubenswrapper[4851]: I1001 12:56:29.115206 4851 scope.go:117] "RemoveContainer" containerID="51eff03ff57493e4b6e9716eb4396cc3b36c7bce525c0ebc679dbbab3506f7cf" Oct 01 12:56:30 crc kubenswrapper[4851]: I1001 12:56:30.022885 4851 generic.go:334] "Generic (PLEG): container finished" podID="7db44d49-9188-4af0-baa3-b612c4a83846" containerID="3cff8b972494166c4dfed36afe6a1c42000fe535f3828146037147868a579c8f" exitCode=0 Oct 01 12:56:30 crc kubenswrapper[4851]: I1001 12:56:30.023113 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b64mk" event={"ID":"7db44d49-9188-4af0-baa3-b612c4a83846","Type":"ContainerDied","Data":"3cff8b972494166c4dfed36afe6a1c42000fe535f3828146037147868a579c8f"} Oct 01 12:56:30 crc kubenswrapper[4851]: I1001 12:56:30.028161 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-2fngg" event={"ID":"95f957df-ff36-44c7-b2fc-5d2bf10c87fa","Type":"ContainerStarted","Data":"65568bd4c488be0f791977e8a0f6a81e2ddea0b2568320c118de1b6fbf05cf7a"} Oct 01 12:56:30 crc kubenswrapper[4851]: I1001 12:56:30.050959 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:56:30 crc kubenswrapper[4851]: I1001 12:56:30.051044 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:56:30 crc kubenswrapper[4851]: I1001 12:56:30.076607 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2fngg" podStartSLOduration=3.567789921 podStartE2EDuration="50.076587842s" podCreationTimestamp="2025-10-01 12:55:40 +0000 UTC" firstStartedPulling="2025-10-01 12:55:42.482531799 +0000 UTC m=+150.827649285" lastFinishedPulling="2025-10-01 12:56:28.99132968 +0000 UTC m=+197.336447206" observedRunningTime="2025-10-01 12:56:30.07511178 +0000 UTC m=+198.420229276" watchObservedRunningTime="2025-10-01 12:56:30.076587842 +0000 UTC m=+198.421705328" Oct 01 12:56:30 crc kubenswrapper[4851]: I1001 12:56:30.337655 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebac54f4-17ba-4305-aa02-a6676cd35731" path="/var/lib/kubelet/pods/ebac54f4-17ba-4305-aa02-a6676cd35731/volumes" Oct 01 12:56:30 crc kubenswrapper[4851]: I1001 12:56:30.853531 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2fngg" Oct 01 12:56:30 crc kubenswrapper[4851]: I1001 12:56:30.853578 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2fngg" Oct 01 12:56:31 crc kubenswrapper[4851]: I1001 12:56:31.902601 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-2fngg" podUID="95f957df-ff36-44c7-b2fc-5d2bf10c87fa" containerName="registry-server" probeResult="failure" output=< Oct 01 12:56:31 crc kubenswrapper[4851]: timeout: failed to connect service ":50051" within 1s Oct 01 12:56:31 crc kubenswrapper[4851]: > Oct 01 12:56:33 crc kubenswrapper[4851]: I1001 12:56:33.056832 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b64mk" event={"ID":"7db44d49-9188-4af0-baa3-b612c4a83846","Type":"ContainerStarted","Data":"d7a58edac8f6cf797924e960c682b0ccdcb37706445d120706309af6c0e25fac"} Oct 01 12:56:34 crc kubenswrapper[4851]: I1001 12:56:34.085718 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b64mk" podStartSLOduration=3.876267081 podStartE2EDuration="54.085700254s" podCreationTimestamp="2025-10-01 12:55:40 +0000 UTC" firstStartedPulling="2025-10-01 12:55:42.467310088 +0000 UTC m=+150.812427574" lastFinishedPulling="2025-10-01 12:56:32.676743221 +0000 UTC m=+201.021860747" observedRunningTime="2025-10-01 12:56:34.084386536 +0000 UTC m=+202.429504032" watchObservedRunningTime="2025-10-01 
12:56:34.085700254 +0000 UTC m=+202.430817750" Oct 01 12:56:40 crc kubenswrapper[4851]: I1001 12:56:40.643250 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b64mk" Oct 01 12:56:40 crc kubenswrapper[4851]: I1001 12:56:40.643962 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b64mk" Oct 01 12:56:40 crc kubenswrapper[4851]: I1001 12:56:40.711424 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b64mk" Oct 01 12:56:40 crc kubenswrapper[4851]: I1001 12:56:40.893990 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2fngg" Oct 01 12:56:40 crc kubenswrapper[4851]: I1001 12:56:40.937355 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2fngg" Oct 01 12:56:41 crc kubenswrapper[4851]: I1001 12:56:41.074766 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6z8fq"] Oct 01 12:56:41 crc kubenswrapper[4851]: I1001 12:56:41.164376 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b64mk" Oct 01 12:56:42 crc kubenswrapper[4851]: I1001 12:56:42.578466 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2fngg"] Oct 01 12:56:42 crc kubenswrapper[4851]: I1001 12:56:42.578893 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2fngg" podUID="95f957df-ff36-44c7-b2fc-5d2bf10c87fa" containerName="registry-server" containerID="cri-o://65568bd4c488be0f791977e8a0f6a81e2ddea0b2568320c118de1b6fbf05cf7a" gracePeriod=2 Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.006547 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2fngg" Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.118266 4851 generic.go:334] "Generic (PLEG): container finished" podID="95f957df-ff36-44c7-b2fc-5d2bf10c87fa" containerID="65568bd4c488be0f791977e8a0f6a81e2ddea0b2568320c118de1b6fbf05cf7a" exitCode=0 Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.118446 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fngg" event={"ID":"95f957df-ff36-44c7-b2fc-5d2bf10c87fa","Type":"ContainerDied","Data":"65568bd4c488be0f791977e8a0f6a81e2ddea0b2568320c118de1b6fbf05cf7a"} Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.118940 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fngg" event={"ID":"95f957df-ff36-44c7-b2fc-5d2bf10c87fa","Type":"ContainerDied","Data":"309fdd69d72b01012fa76c2d696bc8fc0c53c1e38626e762902ed817b659cfac"} Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.119104 4851 scope.go:117] "RemoveContainer" containerID="65568bd4c488be0f791977e8a0f6a81e2ddea0b2568320c118de1b6fbf05cf7a" Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.118555 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2fngg" Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.139436 4851 scope.go:117] "RemoveContainer" containerID="4a88f5ed478e4ccd9062a21ea8dacd543eafe4688f20a1c0b81e9f3afe5abc68" Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.158339 4851 scope.go:117] "RemoveContainer" containerID="4ef204ce078aa4bed8adaac846e664e5b218d50f2707d2e095480bc754915d76" Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.173114 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95f957df-ff36-44c7-b2fc-5d2bf10c87fa-catalog-content\") pod \"95f957df-ff36-44c7-b2fc-5d2bf10c87fa\" (UID: \"95f957df-ff36-44c7-b2fc-5d2bf10c87fa\") " Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.173463 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95f957df-ff36-44c7-b2fc-5d2bf10c87fa-utilities\") pod \"95f957df-ff36-44c7-b2fc-5d2bf10c87fa\" (UID: \"95f957df-ff36-44c7-b2fc-5d2bf10c87fa\") " Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.174263 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lczlr\" (UniqueName: \"kubernetes.io/projected/95f957df-ff36-44c7-b2fc-5d2bf10c87fa-kube-api-access-lczlr\") pod \"95f957df-ff36-44c7-b2fc-5d2bf10c87fa\" (UID: \"95f957df-ff36-44c7-b2fc-5d2bf10c87fa\") " Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.174969 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95f957df-ff36-44c7-b2fc-5d2bf10c87fa-utilities" (OuterVolumeSpecName: "utilities") pod "95f957df-ff36-44c7-b2fc-5d2bf10c87fa" (UID: "95f957df-ff36-44c7-b2fc-5d2bf10c87fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.175376 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95f957df-ff36-44c7-b2fc-5d2bf10c87fa-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.180008 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95f957df-ff36-44c7-b2fc-5d2bf10c87fa-kube-api-access-lczlr" (OuterVolumeSpecName: "kube-api-access-lczlr") pod "95f957df-ff36-44c7-b2fc-5d2bf10c87fa" (UID: "95f957df-ff36-44c7-b2fc-5d2bf10c87fa"). InnerVolumeSpecName "kube-api-access-lczlr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.196363 4851 scope.go:117] "RemoveContainer" containerID="65568bd4c488be0f791977e8a0f6a81e2ddea0b2568320c118de1b6fbf05cf7a" Oct 01 12:56:43 crc kubenswrapper[4851]: E1001 12:56:43.197631 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65568bd4c488be0f791977e8a0f6a81e2ddea0b2568320c118de1b6fbf05cf7a\": container with ID starting with 65568bd4c488be0f791977e8a0f6a81e2ddea0b2568320c118de1b6fbf05cf7a not found: ID does not exist" containerID="65568bd4c488be0f791977e8a0f6a81e2ddea0b2568320c118de1b6fbf05cf7a" Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.197924 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65568bd4c488be0f791977e8a0f6a81e2ddea0b2568320c118de1b6fbf05cf7a"} err="failed to get container status \"65568bd4c488be0f791977e8a0f6a81e2ddea0b2568320c118de1b6fbf05cf7a\": rpc error: code = NotFound desc = could not find container \"65568bd4c488be0f791977e8a0f6a81e2ddea0b2568320c118de1b6fbf05cf7a\": container with ID starting with 65568bd4c488be0f791977e8a0f6a81e2ddea0b2568320c118de1b6fbf05cf7a not found: ID does not exist" Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.198193 4851 scope.go:117] "RemoveContainer" containerID="4a88f5ed478e4ccd9062a21ea8dacd543eafe4688f20a1c0b81e9f3afe5abc68" Oct 01 12:56:43 crc kubenswrapper[4851]: E1001 12:56:43.199254 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a88f5ed478e4ccd9062a21ea8dacd543eafe4688f20a1c0b81e9f3afe5abc68\": container with ID starting with 4a88f5ed478e4ccd9062a21ea8dacd543eafe4688f20a1c0b81e9f3afe5abc68 not found: ID does not exist" containerID="4a88f5ed478e4ccd9062a21ea8dacd543eafe4688f20a1c0b81e9f3afe5abc68" Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.199591 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a88f5ed478e4ccd9062a21ea8dacd543eafe4688f20a1c0b81e9f3afe5abc68"} err="failed to get container status \"4a88f5ed478e4ccd9062a21ea8dacd543eafe4688f20a1c0b81e9f3afe5abc68\": rpc error: code = NotFound desc = could not find container \"4a88f5ed478e4ccd9062a21ea8dacd543eafe4688f20a1c0b81e9f3afe5abc68\": container with ID starting with 4a88f5ed478e4ccd9062a21ea8dacd543eafe4688f20a1c0b81e9f3afe5abc68 not found: ID does not exist" Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.199855 4851 scope.go:117] "RemoveContainer" containerID="4ef204ce078aa4bed8adaac846e664e5b218d50f2707d2e095480bc754915d76" Oct 01 12:56:43 crc kubenswrapper[4851]: E1001 12:56:43.201904 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ef204ce078aa4bed8adaac846e664e5b218d50f2707d2e095480bc754915d76\": container with ID starting with 4ef204ce078aa4bed8adaac846e664e5b218d50f2707d2e095480bc754915d76 not found: ID does not exist" containerID="4ef204ce078aa4bed8adaac846e664e5b218d50f2707d2e095480bc754915d76" Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.201964 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ef204ce078aa4bed8adaac846e664e5b218d50f2707d2e095480bc754915d76"} err="failed to get container status \"4ef204ce078aa4bed8adaac846e664e5b218d50f2707d2e095480bc754915d76\": rpc error: code = NotFound desc = could not 
find container \"4ef204ce078aa4bed8adaac846e664e5b218d50f2707d2e095480bc754915d76\": container with ID starting with 4ef204ce078aa4bed8adaac846e664e5b218d50f2707d2e095480bc754915d76 not found: ID does not exist" Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.231307 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95f957df-ff36-44c7-b2fc-5d2bf10c87fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95f957df-ff36-44c7-b2fc-5d2bf10c87fa" (UID: "95f957df-ff36-44c7-b2fc-5d2bf10c87fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.277196 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95f957df-ff36-44c7-b2fc-5d2bf10c87fa-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.277227 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lczlr\" (UniqueName: \"kubernetes.io/projected/95f957df-ff36-44c7-b2fc-5d2bf10c87fa-kube-api-access-lczlr\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.459478 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2fngg"] Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.462025 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2fngg"] Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.571595 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b64mk"] Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.571799 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b64mk" podUID="7db44d49-9188-4af0-baa3-b612c4a83846" containerName="registry-server" containerID="cri-o://d7a58edac8f6cf797924e960c682b0ccdcb37706445d120706309af6c0e25fac" gracePeriod=2 Oct 01 12:56:43 crc kubenswrapper[4851]: I1001 12:56:43.971928 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b64mk" Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.085308 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7db44d49-9188-4af0-baa3-b612c4a83846-utilities\") pod \"7db44d49-9188-4af0-baa3-b612c4a83846\" (UID: \"7db44d49-9188-4af0-baa3-b612c4a83846\") " Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.085437 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6mnj\" (UniqueName: \"kubernetes.io/projected/7db44d49-9188-4af0-baa3-b612c4a83846-kube-api-access-p6mnj\") pod \"7db44d49-9188-4af0-baa3-b612c4a83846\" (UID: \"7db44d49-9188-4af0-baa3-b612c4a83846\") " Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.085544 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7db44d49-9188-4af0-baa3-b612c4a83846-catalog-content\") pod \"7db44d49-9188-4af0-baa3-b612c4a83846\" (UID: \"7db44d49-9188-4af0-baa3-b612c4a83846\") " Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.086170 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7db44d49-9188-4af0-baa3-b612c4a83846-utilities" (OuterVolumeSpecName: "utilities") pod "7db44d49-9188-4af0-baa3-b612c4a83846" (UID: "7db44d49-9188-4af0-baa3-b612c4a83846"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.091675 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7db44d49-9188-4af0-baa3-b612c4a83846-kube-api-access-p6mnj" (OuterVolumeSpecName: "kube-api-access-p6mnj") pod "7db44d49-9188-4af0-baa3-b612c4a83846" (UID: "7db44d49-9188-4af0-baa3-b612c4a83846"). InnerVolumeSpecName "kube-api-access-p6mnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.130097 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b64mk" Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.130143 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b64mk" event={"ID":"7db44d49-9188-4af0-baa3-b612c4a83846","Type":"ContainerDied","Data":"d7a58edac8f6cf797924e960c682b0ccdcb37706445d120706309af6c0e25fac"} Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.130218 4851 scope.go:117] "RemoveContainer" containerID="d7a58edac8f6cf797924e960c682b0ccdcb37706445d120706309af6c0e25fac" Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.129991 4851 generic.go:334] "Generic (PLEG): container finished" podID="7db44d49-9188-4af0-baa3-b612c4a83846" containerID="d7a58edac8f6cf797924e960c682b0ccdcb37706445d120706309af6c0e25fac" exitCode=0 Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.130942 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b64mk" event={"ID":"7db44d49-9188-4af0-baa3-b612c4a83846","Type":"ContainerDied","Data":"97d9daf8ac84367da2be08d2b03fae377d7b6240c5213e85791b9520e8eee116"} Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.134173 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7db44d49-9188-4af0-baa3-b612c4a83846-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7db44d49-9188-4af0-baa3-b612c4a83846" (UID: "7db44d49-9188-4af0-baa3-b612c4a83846"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.149066 4851 scope.go:117] "RemoveContainer" containerID="3cff8b972494166c4dfed36afe6a1c42000fe535f3828146037147868a579c8f" Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.169898 4851 scope.go:117] "RemoveContainer" containerID="97e3a3a06a38e8d628418eed82ee158cd79489615f07940cd4792af3fd527721" Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.186372 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7db44d49-9188-4af0-baa3-b612c4a83846-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.186396 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7db44d49-9188-4af0-baa3-b612c4a83846-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.186409 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6mnj\" (UniqueName: \"kubernetes.io/projected/7db44d49-9188-4af0-baa3-b612c4a83846-kube-api-access-p6mnj\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.191235 4851 scope.go:117] "RemoveContainer" containerID="d7a58edac8f6cf797924e960c682b0ccdcb37706445d120706309af6c0e25fac" Oct 01 12:56:44 crc kubenswrapper[4851]: E1001 12:56:44.191685 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7a58edac8f6cf797924e960c682b0ccdcb37706445d120706309af6c0e25fac\": container with ID starting with d7a58edac8f6cf797924e960c682b0ccdcb37706445d120706309af6c0e25fac not found: ID does not exist" containerID="d7a58edac8f6cf797924e960c682b0ccdcb37706445d120706309af6c0e25fac" Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.191719 4851 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d7a58edac8f6cf797924e960c682b0ccdcb37706445d120706309af6c0e25fac"} err="failed to get container status \"d7a58edac8f6cf797924e960c682b0ccdcb37706445d120706309af6c0e25fac\": rpc error: code = NotFound desc = could not find container \"d7a58edac8f6cf797924e960c682b0ccdcb37706445d120706309af6c0e25fac\": container with ID starting with d7a58edac8f6cf797924e960c682b0ccdcb37706445d120706309af6c0e25fac not found: ID does not exist" Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.191745 4851 scope.go:117] "RemoveContainer" containerID="3cff8b972494166c4dfed36afe6a1c42000fe535f3828146037147868a579c8f" Oct 01 12:56:44 crc kubenswrapper[4851]: E1001 12:56:44.192076 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cff8b972494166c4dfed36afe6a1c42000fe535f3828146037147868a579c8f\": container with ID starting with 3cff8b972494166c4dfed36afe6a1c42000fe535f3828146037147868a579c8f not found: ID does not exist" containerID="3cff8b972494166c4dfed36afe6a1c42000fe535f3828146037147868a579c8f" Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.192098 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cff8b972494166c4dfed36afe6a1c42000fe535f3828146037147868a579c8f"} err="failed to get container status \"3cff8b972494166c4dfed36afe6a1c42000fe535f3828146037147868a579c8f\": rpc error: code = NotFound desc = could not find container \"3cff8b972494166c4dfed36afe6a1c42000fe535f3828146037147868a579c8f\": container with ID starting with 3cff8b972494166c4dfed36afe6a1c42000fe535f3828146037147868a579c8f not found: ID does not exist" Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.192111 4851 scope.go:117] "RemoveContainer" containerID="97e3a3a06a38e8d628418eed82ee158cd79489615f07940cd4792af3fd527721" Oct 01 12:56:44 crc kubenswrapper[4851]: E1001 12:56:44.192632 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97e3a3a06a38e8d628418eed82ee158cd79489615f07940cd4792af3fd527721\": container with ID starting with 97e3a3a06a38e8d628418eed82ee158cd79489615f07940cd4792af3fd527721 not found: ID does not exist" containerID="97e3a3a06a38e8d628418eed82ee158cd79489615f07940cd4792af3fd527721" Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.192696 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97e3a3a06a38e8d628418eed82ee158cd79489615f07940cd4792af3fd527721"} err="failed to get container status \"97e3a3a06a38e8d628418eed82ee158cd79489615f07940cd4792af3fd527721\": rpc error: code = NotFound desc = could not find container \"97e3a3a06a38e8d628418eed82ee158cd79489615f07940cd4792af3fd527721\": container with ID starting with 97e3a3a06a38e8d628418eed82ee158cd79489615f07940cd4792af3fd527721 not found: ID does not exist" Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.334956 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95f957df-ff36-44c7-b2fc-5d2bf10c87fa" path="/var/lib/kubelet/pods/95f957df-ff36-44c7-b2fc-5d2bf10c87fa/volumes" Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.450995 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b64mk"] Oct 01 12:56:44 crc kubenswrapper[4851]: I1001 12:56:44.453613 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b64mk"] Oct 01 12:56:46 crc 
kubenswrapper[4851]: I1001 12:56:46.339763 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7db44d49-9188-4af0-baa3-b612c4a83846" path="/var/lib/kubelet/pods/7db44d49-9188-4af0-baa3-b612c4a83846/volumes" Oct 01 12:57:00 crc kubenswrapper[4851]: I1001 12:57:00.050208 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:57:00 crc kubenswrapper[4851]: I1001 12:57:00.050819 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:57:00 crc kubenswrapper[4851]: I1001 12:57:00.050880 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 12:57:00 crc kubenswrapper[4851]: I1001 12:57:00.051703 4851 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6"} pod="openshift-machine-config-operator/machine-config-daemon-fv72m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 12:57:00 crc kubenswrapper[4851]: I1001 12:57:00.051800 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" containerID="cri-o://10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6" gracePeriod=600 Oct 01 12:57:00 crc kubenswrapper[4851]: I1001 12:57:00.229008 4851 generic.go:334] "Generic (PLEG): container finished" podID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerID="10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6" exitCode=0 Oct 01 12:57:00 crc kubenswrapper[4851]: I1001 12:57:00.229205 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerDied","Data":"10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6"} Oct 01 12:57:01 crc kubenswrapper[4851]: I1001 12:57:01.238955 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"c0cb68eadfaa5fa243dbc130588998253b08c77917d93919066ac304838046a9"} Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.127854 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" podUID="86146cb0-0a54-4309-8345-565e2da39442" containerName="oauth-openshift" containerID="cri-o://82afd1baebccdbd41b926a7827b4079a155356586b9326a290c6cb6542e1507b" gracePeriod=15 Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.291156 4851 generic.go:334] "Generic (PLEG): container finished" podID="86146cb0-0a54-4309-8345-565e2da39442" 
containerID="82afd1baebccdbd41b926a7827b4079a155356586b9326a290c6cb6542e1507b" exitCode=0 Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.291317 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" event={"ID":"86146cb0-0a54-4309-8345-565e2da39442","Type":"ContainerDied","Data":"82afd1baebccdbd41b926a7827b4079a155356586b9326a290c6cb6542e1507b"} Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.597449 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.626146 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-55c7db9594-b25z2"] Oct 01 12:57:06 crc kubenswrapper[4851]: E1001 12:57:06.626417 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db44d49-9188-4af0-baa3-b612c4a83846" containerName="extract-utilities" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.626447 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db44d49-9188-4af0-baa3-b612c4a83846" containerName="extract-utilities" Oct 01 12:57:06 crc kubenswrapper[4851]: E1001 12:57:06.626471 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8661b3-299a-4c38-95a6-4216fcb6cd3a" containerName="extract-content" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.626483 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8661b3-299a-4c38-95a6-4216fcb6cd3a" containerName="extract-content" Oct 01 12:57:06 crc kubenswrapper[4851]: E1001 12:57:06.626520 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebac54f4-17ba-4305-aa02-a6676cd35731" containerName="extract-content" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.626533 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebac54f4-17ba-4305-aa02-a6676cd35731" containerName="extract-content" Oct 01 12:57:06 crc kubenswrapper[4851]: E1001 12:57:06.626553 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8661b3-299a-4c38-95a6-4216fcb6cd3a" containerName="registry-server" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.626566 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8661b3-299a-4c38-95a6-4216fcb6cd3a" containerName="registry-server" Oct 01 12:57:06 crc kubenswrapper[4851]: E1001 12:57:06.626583 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad71044-3b99-4ece-a9c2-82e9d072e0cd" containerName="pruner" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.626593 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad71044-3b99-4ece-a9c2-82e9d072e0cd" containerName="pruner" Oct 01 12:57:06 crc kubenswrapper[4851]: E1001 12:57:06.626606 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db44d49-9188-4af0-baa3-b612c4a83846" containerName="extract-content" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.626617 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db44d49-9188-4af0-baa3-b612c4a83846" containerName="extract-content" Oct 01 12:57:06 crc kubenswrapper[4851]: E1001 12:57:06.626629 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c27542f4-64a4-4173-a6d3-e985304f1630" containerName="pruner" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.626639 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="c27542f4-64a4-4173-a6d3-e985304f1630" containerName="pruner" 
Oct 01 12:57:06 crc kubenswrapper[4851]: E1001 12:57:06.626658 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86146cb0-0a54-4309-8345-565e2da39442" containerName="oauth-openshift"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.626668 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="86146cb0-0a54-4309-8345-565e2da39442" containerName="oauth-openshift"
Oct 01 12:57:06 crc kubenswrapper[4851]: E1001 12:57:06.626684 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebac54f4-17ba-4305-aa02-a6676cd35731" containerName="extract-utilities"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.626695 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebac54f4-17ba-4305-aa02-a6676cd35731" containerName="extract-utilities"
Oct 01 12:57:06 crc kubenswrapper[4851]: E1001 12:57:06.626710 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f957df-ff36-44c7-b2fc-5d2bf10c87fa" containerName="extract-utilities"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.626721 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f957df-ff36-44c7-b2fc-5d2bf10c87fa" containerName="extract-utilities"
Oct 01 12:57:06 crc kubenswrapper[4851]: E1001 12:57:06.626732 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebac54f4-17ba-4305-aa02-a6676cd35731" containerName="registry-server"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.626743 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebac54f4-17ba-4305-aa02-a6676cd35731" containerName="registry-server"
Oct 01 12:57:06 crc kubenswrapper[4851]: E1001 12:57:06.626762 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f957df-ff36-44c7-b2fc-5d2bf10c87fa" containerName="extract-content"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.626773 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f957df-ff36-44c7-b2fc-5d2bf10c87fa" containerName="extract-content"
Oct 01 12:57:06 crc kubenswrapper[4851]: E1001 12:57:06.626789 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f957df-ff36-44c7-b2fc-5d2bf10c87fa" containerName="registry-server"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.626800 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f957df-ff36-44c7-b2fc-5d2bf10c87fa" containerName="registry-server"
Oct 01 12:57:06 crc kubenswrapper[4851]: E1001 12:57:06.626815 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db44d49-9188-4af0-baa3-b612c4a83846" containerName="registry-server"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.626825 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db44d49-9188-4af0-baa3-b612c4a83846" containerName="registry-server"
Oct 01 12:57:06 crc kubenswrapper[4851]: E1001 12:57:06.626842 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8661b3-299a-4c38-95a6-4216fcb6cd3a" containerName="extract-utilities"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.626853 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8661b3-299a-4c38-95a6-4216fcb6cd3a" containerName="extract-utilities"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.626999 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d8661b3-299a-4c38-95a6-4216fcb6cd3a" containerName="registry-server"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.627016 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="86146cb0-0a54-4309-8345-565e2da39442" containerName="oauth-openshift"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.627035 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="c27542f4-64a4-4173-a6d3-e985304f1630" containerName="pruner"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.627055 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ad71044-3b99-4ece-a9c2-82e9d072e0cd" containerName="pruner"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.627069 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebac54f4-17ba-4305-aa02-a6676cd35731" containerName="registry-server"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.627085 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="7db44d49-9188-4af0-baa3-b612c4a83846" containerName="registry-server"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.627099 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="95f957df-ff36-44c7-b2fc-5d2bf10c87fa" containerName="registry-server"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.627695 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.645457 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-55c7db9594-b25z2"]
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.778279 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-session\") pod \"86146cb0-0a54-4309-8345-565e2da39442\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") "
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.778353 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-template-error\") pod \"86146cb0-0a54-4309-8345-565e2da39442\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") "
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.778393 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86146cb0-0a54-4309-8345-565e2da39442-audit-dir\") pod \"86146cb0-0a54-4309-8345-565e2da39442\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") "
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.778450 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-service-ca\") pod \"86146cb0-0a54-4309-8345-565e2da39442\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") "
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.778488 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-audit-policies\") pod \"86146cb0-0a54-4309-8345-565e2da39442\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") "
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.778574 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-cliconfig\") pod \"86146cb0-0a54-4309-8345-565e2da39442\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") "
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.778624 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-serving-cert\") pod \"86146cb0-0a54-4309-8345-565e2da39442\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") "
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.778659 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krwjm\" (UniqueName: \"kubernetes.io/projected/86146cb0-0a54-4309-8345-565e2da39442-kube-api-access-krwjm\") pod \"86146cb0-0a54-4309-8345-565e2da39442\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") "
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.778697 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-template-login\") pod \"86146cb0-0a54-4309-8345-565e2da39442\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") "
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.778749 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-template-provider-selection\") pod \"86146cb0-0a54-4309-8345-565e2da39442\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") "
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.778806 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-idp-0-file-data\") pod \"86146cb0-0a54-4309-8345-565e2da39442\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") "
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.778841 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-router-certs\") pod \"86146cb0-0a54-4309-8345-565e2da39442\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") "
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.778879 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-trusted-ca-bundle\") pod \"86146cb0-0a54-4309-8345-565e2da39442\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") "
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.778929 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-ocp-branding-template\") pod \"86146cb0-0a54-4309-8345-565e2da39442\" (UID: \"86146cb0-0a54-4309-8345-565e2da39442\") "
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.779226 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.779202 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86146cb0-0a54-4309-8345-565e2da39442-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "86146cb0-0a54-4309-8345-565e2da39442" (UID: "86146cb0-0a54-4309-8345-565e2da39442"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.779282 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c0b00fa-3a96-4697-8684-0f9005b676a1-audit-policies\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.779620 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.779936 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-system-service-ca\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.780041 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-system-router-certs\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.780101 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-system-session\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.780149 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.780192 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.780244 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "86146cb0-0a54-4309-8345-565e2da39442" (UID: "86146cb0-0a54-4309-8345-565e2da39442"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.780294 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "86146cb0-0a54-4309-8345-565e2da39442" (UID: "86146cb0-0a54-4309-8345-565e2da39442"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.780315 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "86146cb0-0a54-4309-8345-565e2da39442" (UID: "86146cb0-0a54-4309-8345-565e2da39442"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.780333 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.780439 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.780592 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-user-template-error\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.780734 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjlxg\" (UniqueName: \"kubernetes.io/projected/3c0b00fa-3a96-4697-8684-0f9005b676a1-kube-api-access-jjlxg\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.780795 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-user-template-login\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.780876 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c0b00fa-3a96-4697-8684-0f9005b676a1-audit-dir\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2"
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.781020 4851 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.781058 4851 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-audit-policies\") on node \"crc\" DevicePath \"\""
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.781090 4851 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.781121 4851 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86146cb0-0a54-4309-8345-565e2da39442-audit-dir\") on node \"crc\" DevicePath \"\""
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.781834 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "86146cb0-0a54-4309-8345-565e2da39442" (UID: "86146cb0-0a54-4309-8345-565e2da39442"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.785433 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "86146cb0-0a54-4309-8345-565e2da39442" (UID: "86146cb0-0a54-4309-8345-565e2da39442"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.785706 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "86146cb0-0a54-4309-8345-565e2da39442" (UID: "86146cb0-0a54-4309-8345-565e2da39442"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.787086 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "86146cb0-0a54-4309-8345-565e2da39442" (UID: "86146cb0-0a54-4309-8345-565e2da39442"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.788196 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86146cb0-0a54-4309-8345-565e2da39442-kube-api-access-krwjm" (OuterVolumeSpecName: "kube-api-access-krwjm") pod "86146cb0-0a54-4309-8345-565e2da39442" (UID: "86146cb0-0a54-4309-8345-565e2da39442"). InnerVolumeSpecName "kube-api-access-krwjm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.788835 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "86146cb0-0a54-4309-8345-565e2da39442" (UID: "86146cb0-0a54-4309-8345-565e2da39442"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.789175 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "86146cb0-0a54-4309-8345-565e2da39442" (UID: "86146cb0-0a54-4309-8345-565e2da39442"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.789709 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "86146cb0-0a54-4309-8345-565e2da39442" (UID: "86146cb0-0a54-4309-8345-565e2da39442"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.790087 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "86146cb0-0a54-4309-8345-565e2da39442" (UID: "86146cb0-0a54-4309-8345-565e2da39442"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.791442 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "86146cb0-0a54-4309-8345-565e2da39442" (UID: "86146cb0-0a54-4309-8345-565e2da39442"). InnerVolumeSpecName "v4-0-config-system-router-certs".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.882175 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c0b00fa-3a96-4697-8684-0f9005b676a1-audit-dir\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.882297 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.882374 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c0b00fa-3a96-4697-8684-0f9005b676a1-audit-policies\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.882403 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c0b00fa-3a96-4697-8684-0f9005b676a1-audit-dir\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.882423 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.882639 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-system-service-ca\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.882715 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-system-router-certs\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.882756 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-system-session\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 
12:57:06.882809 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.882856 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.882895 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.882939 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.883004 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-user-template-error\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.883097 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjlxg\" (UniqueName: \"kubernetes.io/projected/3c0b00fa-3a96-4697-8684-0f9005b676a1-kube-api-access-jjlxg\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.883139 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-user-template-login\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.883267 4851 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.883292 4851 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.883318 4851 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.883339 4851 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.883361 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krwjm\" (UniqueName: \"kubernetes.io/projected/86146cb0-0a54-4309-8345-565e2da39442-kube-api-access-krwjm\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.883381 4851 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.883403 4851 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.883482 4851 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.883528 4851 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.883552 4851 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/86146cb0-0a54-4309-8345-565e2da39442-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.884576 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c0b00fa-3a96-4697-8684-0f9005b676a1-audit-policies\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.884666 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.884690 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-system-service-ca\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.884791 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.887919 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.888771 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-system-session\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.889241 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-user-template-login\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.889689 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.890328 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-system-router-certs\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.890900 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.891149 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-user-template-error\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.891381 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c0b00fa-3a96-4697-8684-0f9005b676a1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.912704 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjlxg\" (UniqueName: \"kubernetes.io/projected/3c0b00fa-3a96-4697-8684-0f9005b676a1-kube-api-access-jjlxg\") pod \"oauth-openshift-55c7db9594-b25z2\" (UID: \"3c0b00fa-3a96-4697-8684-0f9005b676a1\") " pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:06 crc kubenswrapper[4851]: I1001 12:57:06.944034 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:07 crc kubenswrapper[4851]: I1001 12:57:07.303950 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" event={"ID":"86146cb0-0a54-4309-8345-565e2da39442","Type":"ContainerDied","Data":"f4005abaf17d20fb4518a3ea738bcc70e639fbbe57dcd3f12f2e659b106c2073"} Oct 01 12:57:07 crc kubenswrapper[4851]: I1001 12:57:07.304066 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6z8fq" Oct 01 12:57:07 crc kubenswrapper[4851]: I1001 12:57:07.304134 4851 scope.go:117] "RemoveContainer" containerID="82afd1baebccdbd41b926a7827b4079a155356586b9326a290c6cb6542e1507b" Oct 01 12:57:07 crc kubenswrapper[4851]: I1001 12:57:07.341753 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6z8fq"] Oct 01 12:57:07 crc kubenswrapper[4851]: I1001 12:57:07.345058 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6z8fq"] Oct 01 12:57:07 crc kubenswrapper[4851]: I1001 12:57:07.432757 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-55c7db9594-b25z2"] Oct 01 12:57:08 crc kubenswrapper[4851]: I1001 12:57:08.313907 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" event={"ID":"3c0b00fa-3a96-4697-8684-0f9005b676a1","Type":"ContainerStarted","Data":"db3e98aa2f4831b33e64b2f5ce3c18d2656cf8fe965c2e9a6f43cc5da12a0cd7"} Oct 01 12:57:08 crc kubenswrapper[4851]: I1001 12:57:08.314249 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" event={"ID":"3c0b00fa-3a96-4697-8684-0f9005b676a1","Type":"ContainerStarted","Data":"0eabf3722af3a1983cc0113e05926c03ee2adb09c8df5047bf38ef97428654cf"} Oct 01 12:57:08 crc kubenswrapper[4851]: I1001 12:57:08.314271 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:08 crc kubenswrapper[4851]: I1001 12:57:08.321267 4851 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" Oct 01 12:57:08 crc kubenswrapper[4851]: I1001 12:57:08.339374 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86146cb0-0a54-4309-8345-565e2da39442" path="/var/lib/kubelet/pods/86146cb0-0a54-4309-8345-565e2da39442/volumes" Oct 01 12:57:08 crc kubenswrapper[4851]: I1001 12:57:08.343570 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-55c7db9594-b25z2" podStartSLOduration=27.343544177 podStartE2EDuration="27.343544177s" podCreationTimestamp="2025-10-01 12:56:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:57:08.340859711 +0000 UTC m=+236.685977257" watchObservedRunningTime="2025-10-01 12:57:08.343544177 +0000 UTC m=+236.688661703" Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.563078 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8b48t"] Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.564161 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8b48t" podUID="9867a8d1-19de-4887-a5da-5d13588544c0" containerName="registry-server" containerID="cri-o://3863a599b41d770a0d56f379581a195e84cd518eeba5c2b0c2487285bd0cd119" gracePeriod=30 Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.566588 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xvh4p"] Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.566860 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xvh4p" podUID="ed8f6b65-9192-41e4-80dd-81688a0714a8" containerName="registry-server" containerID="cri-o://92a419962e5d269d99879f7c0cbab5d91032658846ea9112b8b898dd5676d918" gracePeriod=30 Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.581467 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7hnt8"] Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.581725 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" podUID="e723ab1e-2b65-4236-91f4-1dbae3e4acf7" containerName="marketplace-operator" containerID="cri-o://e207030b670b6f4efff60bd08a6a9a867c5484b60b7df19f3e2fcb4b31df99ec" gracePeriod=30 Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.595564 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-msbpg"] Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.595918 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-msbpg" podUID="04430c41-7121-4c5d-803b-db06d0dd7237" containerName="registry-server" containerID="cri-o://5ecebf7d8998720bda03d2040ff441c8dc1303ac7da82310b51b756a1ecb5ada" gracePeriod=30 Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.599651 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-97jrf"] Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.599950 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-97jrf" podUID="0390df89-e5c5-4684-8ef5-3006aed29cd6" 
containerName="registry-server" containerID="cri-o://41e33532d385dadaba6ed68d92efeca67ec5c0b6719d4aa3e5735fea3bf7b433" gracePeriod=30 Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.627036 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xtrc2"] Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.628663 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xtrc2" Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.663732 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xtrc2"] Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.777427 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-httcs\" (UniqueName: \"kubernetes.io/projected/0e26c58b-4c8a-4b07-96d9-7dea3f32ed8d-kube-api-access-httcs\") pod \"marketplace-operator-79b997595-xtrc2\" (UID: \"0e26c58b-4c8a-4b07-96d9-7dea3f32ed8d\") " pod="openshift-marketplace/marketplace-operator-79b997595-xtrc2" Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.777606 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e26c58b-4c8a-4b07-96d9-7dea3f32ed8d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xtrc2\" (UID: \"0e26c58b-4c8a-4b07-96d9-7dea3f32ed8d\") " pod="openshift-marketplace/marketplace-operator-79b997595-xtrc2" Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.777699 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0e26c58b-4c8a-4b07-96d9-7dea3f32ed8d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xtrc2\" (UID: \"0e26c58b-4c8a-4b07-96d9-7dea3f32ed8d\") " pod="openshift-marketplace/marketplace-operator-79b997595-xtrc2" Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.878723 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e26c58b-4c8a-4b07-96d9-7dea3f32ed8d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xtrc2\" (UID: \"0e26c58b-4c8a-4b07-96d9-7dea3f32ed8d\") " pod="openshift-marketplace/marketplace-operator-79b997595-xtrc2" Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.878780 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0e26c58b-4c8a-4b07-96d9-7dea3f32ed8d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xtrc2\" (UID: \"0e26c58b-4c8a-4b07-96d9-7dea3f32ed8d\") " pod="openshift-marketplace/marketplace-operator-79b997595-xtrc2" Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.878854 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-httcs\" (UniqueName: \"kubernetes.io/projected/0e26c58b-4c8a-4b07-96d9-7dea3f32ed8d-kube-api-access-httcs\") pod \"marketplace-operator-79b997595-xtrc2\" (UID: \"0e26c58b-4c8a-4b07-96d9-7dea3f32ed8d\") " pod="openshift-marketplace/marketplace-operator-79b997595-xtrc2" Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.880061 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/0e26c58b-4c8a-4b07-96d9-7dea3f32ed8d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xtrc2\" (UID: \"0e26c58b-4c8a-4b07-96d9-7dea3f32ed8d\") " pod="openshift-marketplace/marketplace-operator-79b997595-xtrc2" Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.884775 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0e26c58b-4c8a-4b07-96d9-7dea3f32ed8d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xtrc2\" (UID: \"0e26c58b-4c8a-4b07-96d9-7dea3f32ed8d\") " pod="openshift-marketplace/marketplace-operator-79b997595-xtrc2" Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.895097 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-httcs\" (UniqueName: \"kubernetes.io/projected/0e26c58b-4c8a-4b07-96d9-7dea3f32ed8d-kube-api-access-httcs\") pod \"marketplace-operator-79b997595-xtrc2\" (UID: \"0e26c58b-4c8a-4b07-96d9-7dea3f32ed8d\") " pod="openshift-marketplace/marketplace-operator-79b997595-xtrc2" Oct 01 12:57:21 crc kubenswrapper[4851]: I1001 12:57:21.959360 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xtrc2" Oct 01 12:57:22 crc kubenswrapper[4851]: E1001 12:57:22.295421 4851 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ecebf7d8998720bda03d2040ff441c8dc1303ac7da82310b51b756a1ecb5ada is running failed: container process not found" containerID="5ecebf7d8998720bda03d2040ff441c8dc1303ac7da82310b51b756a1ecb5ada" cmd=["grpc_health_probe","-addr=:50051"] Oct 01 12:57:22 crc kubenswrapper[4851]: E1001 12:57:22.295753 4851 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ecebf7d8998720bda03d2040ff441c8dc1303ac7da82310b51b756a1ecb5ada is running failed: container process not found" containerID="5ecebf7d8998720bda03d2040ff441c8dc1303ac7da82310b51b756a1ecb5ada" cmd=["grpc_health_probe","-addr=:50051"] Oct 01 12:57:22 crc kubenswrapper[4851]: E1001 12:57:22.296063 4851 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ecebf7d8998720bda03d2040ff441c8dc1303ac7da82310b51b756a1ecb5ada is running failed: container process not found" containerID="5ecebf7d8998720bda03d2040ff441c8dc1303ac7da82310b51b756a1ecb5ada" cmd=["grpc_health_probe","-addr=:50051"] Oct 01 12:57:22 crc kubenswrapper[4851]: E1001 12:57:22.296091 4851 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ecebf7d8998720bda03d2040ff441c8dc1303ac7da82310b51b756a1ecb5ada is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-msbpg" podUID="04430c41-7121-4c5d-803b-db06d0dd7237" containerName="registry-server" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.352569 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xtrc2"] Oct 01 12:57:22 crc kubenswrapper[4851]: W1001 12:57:22.361900 4851 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e26c58b_4c8a_4b07_96d9_7dea3f32ed8d.slice/crio-22aec337d702d38054d1a4fc340b852885f630695204660450738e8d667287bf WatchSource:0}: Error finding container 22aec337d702d38054d1a4fc340b852885f630695204660450738e8d667287bf: Status 404 returned error can't find the container with id 22aec337d702d38054d1a4fc340b852885f630695204660450738e8d667287bf Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.406490 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xtrc2" event={"ID":"0e26c58b-4c8a-4b07-96d9-7dea3f32ed8d","Type":"ContainerStarted","Data":"22aec337d702d38054d1a4fc340b852885f630695204660450738e8d667287bf"} Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.411348 4851 generic.go:334] "Generic (PLEG): container finished" podID="0390df89-e5c5-4684-8ef5-3006aed29cd6" containerID="41e33532d385dadaba6ed68d92efeca67ec5c0b6719d4aa3e5735fea3bf7b433" exitCode=0 Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.411455 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97jrf" event={"ID":"0390df89-e5c5-4684-8ef5-3006aed29cd6","Type":"ContainerDied","Data":"41e33532d385dadaba6ed68d92efeca67ec5c0b6719d4aa3e5735fea3bf7b433"} Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.415829 4851 generic.go:334] "Generic (PLEG): container finished" podID="04430c41-7121-4c5d-803b-db06d0dd7237" containerID="5ecebf7d8998720bda03d2040ff441c8dc1303ac7da82310b51b756a1ecb5ada" exitCode=0 Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.415897 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-msbpg" event={"ID":"04430c41-7121-4c5d-803b-db06d0dd7237","Type":"ContainerDied","Data":"5ecebf7d8998720bda03d2040ff441c8dc1303ac7da82310b51b756a1ecb5ada"} Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.422176 4851 generic.go:334] "Generic (PLEG): container finished" podID="ed8f6b65-9192-41e4-80dd-81688a0714a8" containerID="92a419962e5d269d99879f7c0cbab5d91032658846ea9112b8b898dd5676d918" exitCode=0 Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.422240 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvh4p" event={"ID":"ed8f6b65-9192-41e4-80dd-81688a0714a8","Type":"ContainerDied","Data":"92a419962e5d269d99879f7c0cbab5d91032658846ea9112b8b898dd5676d918"} Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.423286 4851 generic.go:334] "Generic (PLEG): container finished" podID="e723ab1e-2b65-4236-91f4-1dbae3e4acf7" containerID="e207030b670b6f4efff60bd08a6a9a867c5484b60b7df19f3e2fcb4b31df99ec" exitCode=0 Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.423342 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" event={"ID":"e723ab1e-2b65-4236-91f4-1dbae3e4acf7","Type":"ContainerDied","Data":"e207030b670b6f4efff60bd08a6a9a867c5484b60b7df19f3e2fcb4b31df99ec"} Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.425006 4851 generic.go:334] "Generic (PLEG): container finished" podID="9867a8d1-19de-4887-a5da-5d13588544c0" containerID="3863a599b41d770a0d56f379581a195e84cd518eeba5c2b0c2487285bd0cd119" exitCode=0 Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.425030 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8b48t" 
event={"ID":"9867a8d1-19de-4887-a5da-5d13588544c0","Type":"ContainerDied","Data":"3863a599b41d770a0d56f379581a195e84cd518eeba5c2b0c2487285bd0cd119"} Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.473625 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8b48t" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.567750 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xvh4p" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.569205 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-97jrf" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.570564 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.578465 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-msbpg" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.588399 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9867a8d1-19de-4887-a5da-5d13588544c0-catalog-content\") pod \"9867a8d1-19de-4887-a5da-5d13588544c0\" (UID: \"9867a8d1-19de-4887-a5da-5d13588544c0\") " Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.588466 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txm9r\" (UniqueName: \"kubernetes.io/projected/9867a8d1-19de-4887-a5da-5d13588544c0-kube-api-access-txm9r\") pod \"9867a8d1-19de-4887-a5da-5d13588544c0\" (UID: \"9867a8d1-19de-4887-a5da-5d13588544c0\") " Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.588536 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9867a8d1-19de-4887-a5da-5d13588544c0-utilities\") pod \"9867a8d1-19de-4887-a5da-5d13588544c0\" (UID: \"9867a8d1-19de-4887-a5da-5d13588544c0\") " Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.589534 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9867a8d1-19de-4887-a5da-5d13588544c0-utilities" (OuterVolumeSpecName: "utilities") pod "9867a8d1-19de-4887-a5da-5d13588544c0" (UID: "9867a8d1-19de-4887-a5da-5d13588544c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.596321 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9867a8d1-19de-4887-a5da-5d13588544c0-kube-api-access-txm9r" (OuterVolumeSpecName: "kube-api-access-txm9r") pod "9867a8d1-19de-4887-a5da-5d13588544c0" (UID: "9867a8d1-19de-4887-a5da-5d13588544c0"). InnerVolumeSpecName "kube-api-access-txm9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.656704 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9867a8d1-19de-4887-a5da-5d13588544c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9867a8d1-19de-4887-a5da-5d13588544c0" (UID: "9867a8d1-19de-4887-a5da-5d13588544c0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.689280 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04430c41-7121-4c5d-803b-db06d0dd7237-utilities\") pod \"04430c41-7121-4c5d-803b-db06d0dd7237\" (UID: \"04430c41-7121-4c5d-803b-db06d0dd7237\") " Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.689330 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04430c41-7121-4c5d-803b-db06d0dd7237-catalog-content\") pod \"04430c41-7121-4c5d-803b-db06d0dd7237\" (UID: \"04430c41-7121-4c5d-803b-db06d0dd7237\") " Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.689382 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e723ab1e-2b65-4236-91f4-1dbae3e4acf7-marketplace-trusted-ca\") pod \"e723ab1e-2b65-4236-91f4-1dbae3e4acf7\" (UID: \"e723ab1e-2b65-4236-91f4-1dbae3e4acf7\") " Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.689406 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8f6b65-9192-41e4-80dd-81688a0714a8-catalog-content\") pod \"ed8f6b65-9192-41e4-80dd-81688a0714a8\" (UID: \"ed8f6b65-9192-41e4-80dd-81688a0714a8\") " Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.689425 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8f6b65-9192-41e4-80dd-81688a0714a8-utilities\") pod \"ed8f6b65-9192-41e4-80dd-81688a0714a8\" (UID: \"ed8f6b65-9192-41e4-80dd-81688a0714a8\") " Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.689447 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgkjw\" (UniqueName: \"kubernetes.io/projected/04430c41-7121-4c5d-803b-db06d0dd7237-kube-api-access-bgkjw\") pod \"04430c41-7121-4c5d-803b-db06d0dd7237\" (UID: \"04430c41-7121-4c5d-803b-db06d0dd7237\") " Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.689463 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76qf9\" (UniqueName: \"kubernetes.io/projected/0390df89-e5c5-4684-8ef5-3006aed29cd6-kube-api-access-76qf9\") pod \"0390df89-e5c5-4684-8ef5-3006aed29cd6\" (UID: \"0390df89-e5c5-4684-8ef5-3006aed29cd6\") " Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.689489 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0390df89-e5c5-4684-8ef5-3006aed29cd6-utilities\") pod \"0390df89-e5c5-4684-8ef5-3006aed29cd6\" (UID: \"0390df89-e5c5-4684-8ef5-3006aed29cd6\") " Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.689526 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e723ab1e-2b65-4236-91f4-1dbae3e4acf7-marketplace-operator-metrics\") pod \"e723ab1e-2b65-4236-91f4-1dbae3e4acf7\" (UID: \"e723ab1e-2b65-4236-91f4-1dbae3e4acf7\") " Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.689547 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0390df89-e5c5-4684-8ef5-3006aed29cd6-catalog-content\") pod 
\"0390df89-e5c5-4684-8ef5-3006aed29cd6\" (UID: \"0390df89-e5c5-4684-8ef5-3006aed29cd6\") " Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.689569 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88g9p\" (UniqueName: \"kubernetes.io/projected/e723ab1e-2b65-4236-91f4-1dbae3e4acf7-kube-api-access-88g9p\") pod \"e723ab1e-2b65-4236-91f4-1dbae3e4acf7\" (UID: \"e723ab1e-2b65-4236-91f4-1dbae3e4acf7\") " Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.689585 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv5vk\" (UniqueName: \"kubernetes.io/projected/ed8f6b65-9192-41e4-80dd-81688a0714a8-kube-api-access-vv5vk\") pod \"ed8f6b65-9192-41e4-80dd-81688a0714a8\" (UID: \"ed8f6b65-9192-41e4-80dd-81688a0714a8\") " Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.689748 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txm9r\" (UniqueName: \"kubernetes.io/projected/9867a8d1-19de-4887-a5da-5d13588544c0-kube-api-access-txm9r\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.689760 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9867a8d1-19de-4887-a5da-5d13588544c0-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.689768 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9867a8d1-19de-4887-a5da-5d13588544c0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.690576 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e723ab1e-2b65-4236-91f4-1dbae3e4acf7-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "e723ab1e-2b65-4236-91f4-1dbae3e4acf7" (UID: "e723ab1e-2b65-4236-91f4-1dbae3e4acf7"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.691233 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04430c41-7121-4c5d-803b-db06d0dd7237-utilities" (OuterVolumeSpecName: "utilities") pod "04430c41-7121-4c5d-803b-db06d0dd7237" (UID: "04430c41-7121-4c5d-803b-db06d0dd7237"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.692237 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed8f6b65-9192-41e4-80dd-81688a0714a8-utilities" (OuterVolumeSpecName: "utilities") pod "ed8f6b65-9192-41e4-80dd-81688a0714a8" (UID: "ed8f6b65-9192-41e4-80dd-81688a0714a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.693813 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04430c41-7121-4c5d-803b-db06d0dd7237-kube-api-access-bgkjw" (OuterVolumeSpecName: "kube-api-access-bgkjw") pod "04430c41-7121-4c5d-803b-db06d0dd7237" (UID: "04430c41-7121-4c5d-803b-db06d0dd7237"). InnerVolumeSpecName "kube-api-access-bgkjw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.694188 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0390df89-e5c5-4684-8ef5-3006aed29cd6-utilities" (OuterVolumeSpecName: "utilities") pod "0390df89-e5c5-4684-8ef5-3006aed29cd6" (UID: "0390df89-e5c5-4684-8ef5-3006aed29cd6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.694995 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e723ab1e-2b65-4236-91f4-1dbae3e4acf7-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "e723ab1e-2b65-4236-91f4-1dbae3e4acf7" (UID: "e723ab1e-2b65-4236-91f4-1dbae3e4acf7"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.695904 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed8f6b65-9192-41e4-80dd-81688a0714a8-kube-api-access-vv5vk" (OuterVolumeSpecName: "kube-api-access-vv5vk") pod "ed8f6b65-9192-41e4-80dd-81688a0714a8" (UID: "ed8f6b65-9192-41e4-80dd-81688a0714a8"). InnerVolumeSpecName "kube-api-access-vv5vk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.697348 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e723ab1e-2b65-4236-91f4-1dbae3e4acf7-kube-api-access-88g9p" (OuterVolumeSpecName: "kube-api-access-88g9p") pod "e723ab1e-2b65-4236-91f4-1dbae3e4acf7" (UID: "e723ab1e-2b65-4236-91f4-1dbae3e4acf7"). InnerVolumeSpecName "kube-api-access-88g9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.697581 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0390df89-e5c5-4684-8ef5-3006aed29cd6-kube-api-access-76qf9" (OuterVolumeSpecName: "kube-api-access-76qf9") pod "0390df89-e5c5-4684-8ef5-3006aed29cd6" (UID: "0390df89-e5c5-4684-8ef5-3006aed29cd6"). InnerVolumeSpecName "kube-api-access-76qf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.705589 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04430c41-7121-4c5d-803b-db06d0dd7237-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04430c41-7121-4c5d-803b-db06d0dd7237" (UID: "04430c41-7121-4c5d-803b-db06d0dd7237"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.765861 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed8f6b65-9192-41e4-80dd-81688a0714a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed8f6b65-9192-41e4-80dd-81688a0714a8" (UID: "ed8f6b65-9192-41e4-80dd-81688a0714a8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.783211 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0390df89-e5c5-4684-8ef5-3006aed29cd6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0390df89-e5c5-4684-8ef5-3006aed29cd6" (UID: "0390df89-e5c5-4684-8ef5-3006aed29cd6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.790906 4851 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e723ab1e-2b65-4236-91f4-1dbae3e4acf7-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.790954 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8f6b65-9192-41e4-80dd-81688a0714a8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.790964 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8f6b65-9192-41e4-80dd-81688a0714a8-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.790973 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgkjw\" (UniqueName: \"kubernetes.io/projected/04430c41-7121-4c5d-803b-db06d0dd7237-kube-api-access-bgkjw\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.790984 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76qf9\" (UniqueName: \"kubernetes.io/projected/0390df89-e5c5-4684-8ef5-3006aed29cd6-kube-api-access-76qf9\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.790992 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0390df89-e5c5-4684-8ef5-3006aed29cd6-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.791001 4851 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e723ab1e-2b65-4236-91f4-1dbae3e4acf7-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.791012 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0390df89-e5c5-4684-8ef5-3006aed29cd6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.791034 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88g9p\" (UniqueName: \"kubernetes.io/projected/e723ab1e-2b65-4236-91f4-1dbae3e4acf7-kube-api-access-88g9p\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.791044 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv5vk\" (UniqueName: \"kubernetes.io/projected/ed8f6b65-9192-41e4-80dd-81688a0714a8-kube-api-access-vv5vk\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:22 crc kubenswrapper[4851]: I1001 12:57:22.791051 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04430c41-7121-4c5d-803b-db06d0dd7237-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:22 crc 
kubenswrapper[4851]: I1001 12:57:22.791059 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04430c41-7121-4c5d-803b-db06d0dd7237-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.432975 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97jrf" event={"ID":"0390df89-e5c5-4684-8ef5-3006aed29cd6","Type":"ContainerDied","Data":"c3b572492ae3da9258ca83baa95e4daaec33fb69d00a9be14b8116334f251fb9"} Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.433474 4851 scope.go:117] "RemoveContainer" containerID="41e33532d385dadaba6ed68d92efeca67ec5c0b6719d4aa3e5735fea3bf7b433" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.434277 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-97jrf" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.436352 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-msbpg" event={"ID":"04430c41-7121-4c5d-803b-db06d0dd7237","Type":"ContainerDied","Data":"e2cd39b564ed0181016a806b62c8792fcf575f2deeaca3da1e69c555814a70c4"} Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.436455 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-msbpg" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.439664 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvh4p" event={"ID":"ed8f6b65-9192-41e4-80dd-81688a0714a8","Type":"ContainerDied","Data":"98badfbd244239b72d57bafb87c60490d2047d9ac56c52018f0988a13bb74627"} Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.439816 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xvh4p" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.445806 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8b48t" event={"ID":"9867a8d1-19de-4887-a5da-5d13588544c0","Type":"ContainerDied","Data":"b9b612cbec5a285599ef58df8c5ba3edb5d94490b29f8b3182e929d1861a32b9"} Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.445939 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8b48t" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.448683 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" event={"ID":"e723ab1e-2b65-4236-91f4-1dbae3e4acf7","Type":"ContainerDied","Data":"540075bda7660bb664eea8c4e9cfa15ece5148c66d5d40e83749872232037d54"} Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.448742 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7hnt8" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.450150 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xtrc2" event={"ID":"0e26c58b-4c8a-4b07-96d9-7dea3f32ed8d","Type":"ContainerStarted","Data":"a2ab5ceaf9363a710b31e0f8eb73482e21d715043c452605f760aa65762b7181"} Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.450458 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xtrc2" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.455093 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xtrc2" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.463142 4851 scope.go:117] "RemoveContainer" containerID="2ef89ae48bf3050785e20caaf1e41bdfffde25761d68b539c6fe7a74d689f4f1" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.480010 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xtrc2" podStartSLOduration=2.47999274 podStartE2EDuration="2.47999274s" podCreationTimestamp="2025-10-01 12:57:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:57:23.474008209 +0000 UTC m=+251.819125725" watchObservedRunningTime="2025-10-01 12:57:23.47999274 +0000 UTC m=+251.825110236" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.503236 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-97jrf"] Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.503846 4851 scope.go:117] "RemoveContainer" containerID="3085d853ca175611b856f9dfc8f0cd377bcc4a07f744db31fff05cfe986c217c" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.509718 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-97jrf"] Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.519365 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-msbpg"] Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.534311 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-msbpg"] Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.535782 4851 scope.go:117] "RemoveContainer" containerID="5ecebf7d8998720bda03d2040ff441c8dc1303ac7da82310b51b756a1ecb5ada" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.567626 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xvh4p"] Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.576320 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xvh4p"] Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.585334 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8b48t"] Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.585618 4851 scope.go:117] "RemoveContainer" containerID="fbb54e6a60dfa104ee5a5315d0194e3715fb29fe2262f18954ee39a85c1547d1" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.588946 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8b48t"] Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 
12:57:23.609712 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7hnt8"] Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.609946 4851 scope.go:117] "RemoveContainer" containerID="1ef475d8ef0f9ef1b28d83cb0fff8ad86b4eb47775afdc528533ae3bf9aae00e" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.612419 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7hnt8"] Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.623889 4851 scope.go:117] "RemoveContainer" containerID="92a419962e5d269d99879f7c0cbab5d91032658846ea9112b8b898dd5676d918" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.635209 4851 scope.go:117] "RemoveContainer" containerID="6067ea7853a4b226f755d78ae40358fc04609e319c688f6ca97bbee3eb258cdf" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.648431 4851 scope.go:117] "RemoveContainer" containerID="3d236a2b04cfc4baa6a5dabee673aa329077f504020635bbf5b31013e213dd8f" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.658997 4851 scope.go:117] "RemoveContainer" containerID="3863a599b41d770a0d56f379581a195e84cd518eeba5c2b0c2487285bd0cd119" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.671268 4851 scope.go:117] "RemoveContainer" containerID="4f1826a85dfddccef1ad3d5b32a584b282fa85e080e0ea4006f9694f78131923" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.713660 4851 scope.go:117] "RemoveContainer" containerID="89179ee4f4ea0d8f7c4f8a781b87d4712e51266ddf4e8de081498d5a7ce8c826" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.735912 4851 scope.go:117] "RemoveContainer" containerID="e207030b670b6f4efff60bd08a6a9a867c5484b60b7df19f3e2fcb4b31df99ec" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.967911 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7cggq"] Oct 01 12:57:23 crc kubenswrapper[4851]: E1001 12:57:23.968088 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e723ab1e-2b65-4236-91f4-1dbae3e4acf7" containerName="marketplace-operator" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.968100 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="e723ab1e-2b65-4236-91f4-1dbae3e4acf7" containerName="marketplace-operator" Oct 01 12:57:23 crc kubenswrapper[4851]: E1001 12:57:23.968108 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04430c41-7121-4c5d-803b-db06d0dd7237" containerName="registry-server" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.968114 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="04430c41-7121-4c5d-803b-db06d0dd7237" containerName="registry-server" Oct 01 12:57:23 crc kubenswrapper[4851]: E1001 12:57:23.968123 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0390df89-e5c5-4684-8ef5-3006aed29cd6" containerName="registry-server" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.968129 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="0390df89-e5c5-4684-8ef5-3006aed29cd6" containerName="registry-server" Oct 01 12:57:23 crc kubenswrapper[4851]: E1001 12:57:23.968138 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0390df89-e5c5-4684-8ef5-3006aed29cd6" containerName="extract-utilities" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.968144 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="0390df89-e5c5-4684-8ef5-3006aed29cd6" containerName="extract-utilities" Oct 01 12:57:23 crc 
kubenswrapper[4851]: E1001 12:57:23.968154 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04430c41-7121-4c5d-803b-db06d0dd7237" containerName="extract-utilities" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.968159 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="04430c41-7121-4c5d-803b-db06d0dd7237" containerName="extract-utilities" Oct 01 12:57:23 crc kubenswrapper[4851]: E1001 12:57:23.968169 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04430c41-7121-4c5d-803b-db06d0dd7237" containerName="extract-content" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.968175 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="04430c41-7121-4c5d-803b-db06d0dd7237" containerName="extract-content" Oct 01 12:57:23 crc kubenswrapper[4851]: E1001 12:57:23.968186 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9867a8d1-19de-4887-a5da-5d13588544c0" containerName="extract-content" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.968193 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="9867a8d1-19de-4887-a5da-5d13588544c0" containerName="extract-content" Oct 01 12:57:23 crc kubenswrapper[4851]: E1001 12:57:23.968201 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9867a8d1-19de-4887-a5da-5d13588544c0" containerName="extract-utilities" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.968207 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="9867a8d1-19de-4887-a5da-5d13588544c0" containerName="extract-utilities" Oct 01 12:57:23 crc kubenswrapper[4851]: E1001 12:57:23.968214 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9867a8d1-19de-4887-a5da-5d13588544c0" containerName="registry-server" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.968220 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="9867a8d1-19de-4887-a5da-5d13588544c0" containerName="registry-server" Oct 01 12:57:23 crc kubenswrapper[4851]: E1001 12:57:23.968229 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8f6b65-9192-41e4-80dd-81688a0714a8" containerName="registry-server" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.968234 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8f6b65-9192-41e4-80dd-81688a0714a8" containerName="registry-server" Oct 01 12:57:23 crc kubenswrapper[4851]: E1001 12:57:23.968242 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8f6b65-9192-41e4-80dd-81688a0714a8" containerName="extract-utilities" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.968248 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8f6b65-9192-41e4-80dd-81688a0714a8" containerName="extract-utilities" Oct 01 12:57:23 crc kubenswrapper[4851]: E1001 12:57:23.968259 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0390df89-e5c5-4684-8ef5-3006aed29cd6" containerName="extract-content" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.968265 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="0390df89-e5c5-4684-8ef5-3006aed29cd6" containerName="extract-content" Oct 01 12:57:23 crc kubenswrapper[4851]: E1001 12:57:23.968274 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8f6b65-9192-41e4-80dd-81688a0714a8" containerName="extract-content" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.968281 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8f6b65-9192-41e4-80dd-81688a0714a8" 
containerName="extract-content" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.968352 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="9867a8d1-19de-4887-a5da-5d13588544c0" containerName="registry-server" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.968364 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="04430c41-7121-4c5d-803b-db06d0dd7237" containerName="registry-server" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.968373 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="0390df89-e5c5-4684-8ef5-3006aed29cd6" containerName="registry-server" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.968379 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="e723ab1e-2b65-4236-91f4-1dbae3e4acf7" containerName="marketplace-operator" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.968388 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed8f6b65-9192-41e4-80dd-81688a0714a8" containerName="registry-server" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.969107 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7cggq" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.971436 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 01 12:57:23 crc kubenswrapper[4851]: I1001 12:57:23.985720 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7cggq"] Oct 01 12:57:24 crc kubenswrapper[4851]: I1001 12:57:24.110583 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd203b3c-a861-4dba-9df9-7328c541c294-utilities\") pod \"redhat-operators-7cggq\" (UID: \"dd203b3c-a861-4dba-9df9-7328c541c294\") " pod="openshift-marketplace/redhat-operators-7cggq" Oct 01 12:57:24 crc kubenswrapper[4851]: I1001 12:57:24.110628 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd203b3c-a861-4dba-9df9-7328c541c294-catalog-content\") pod \"redhat-operators-7cggq\" (UID: \"dd203b3c-a861-4dba-9df9-7328c541c294\") " pod="openshift-marketplace/redhat-operators-7cggq" Oct 01 12:57:24 crc kubenswrapper[4851]: I1001 12:57:24.110674 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh5z2\" (UniqueName: \"kubernetes.io/projected/dd203b3c-a861-4dba-9df9-7328c541c294-kube-api-access-xh5z2\") pod \"redhat-operators-7cggq\" (UID: \"dd203b3c-a861-4dba-9df9-7328c541c294\") " pod="openshift-marketplace/redhat-operators-7cggq" Oct 01 12:57:24 crc kubenswrapper[4851]: I1001 12:57:24.211579 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh5z2\" (UniqueName: \"kubernetes.io/projected/dd203b3c-a861-4dba-9df9-7328c541c294-kube-api-access-xh5z2\") pod \"redhat-operators-7cggq\" (UID: \"dd203b3c-a861-4dba-9df9-7328c541c294\") " pod="openshift-marketplace/redhat-operators-7cggq" Oct 01 12:57:24 crc kubenswrapper[4851]: I1001 12:57:24.211698 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd203b3c-a861-4dba-9df9-7328c541c294-utilities\") pod \"redhat-operators-7cggq\" (UID: 
\"dd203b3c-a861-4dba-9df9-7328c541c294\") " pod="openshift-marketplace/redhat-operators-7cggq" Oct 01 12:57:24 crc kubenswrapper[4851]: I1001 12:57:24.211752 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd203b3c-a861-4dba-9df9-7328c541c294-catalog-content\") pod \"redhat-operators-7cggq\" (UID: \"dd203b3c-a861-4dba-9df9-7328c541c294\") " pod="openshift-marketplace/redhat-operators-7cggq" Oct 01 12:57:24 crc kubenswrapper[4851]: I1001 12:57:24.212119 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd203b3c-a861-4dba-9df9-7328c541c294-utilities\") pod \"redhat-operators-7cggq\" (UID: \"dd203b3c-a861-4dba-9df9-7328c541c294\") " pod="openshift-marketplace/redhat-operators-7cggq" Oct 01 12:57:24 crc kubenswrapper[4851]: I1001 12:57:24.212281 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd203b3c-a861-4dba-9df9-7328c541c294-catalog-content\") pod \"redhat-operators-7cggq\" (UID: \"dd203b3c-a861-4dba-9df9-7328c541c294\") " pod="openshift-marketplace/redhat-operators-7cggq" Oct 01 12:57:24 crc kubenswrapper[4851]: I1001 12:57:24.231565 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh5z2\" (UniqueName: \"kubernetes.io/projected/dd203b3c-a861-4dba-9df9-7328c541c294-kube-api-access-xh5z2\") pod \"redhat-operators-7cggq\" (UID: \"dd203b3c-a861-4dba-9df9-7328c541c294\") " pod="openshift-marketplace/redhat-operators-7cggq" Oct 01 12:57:24 crc kubenswrapper[4851]: I1001 12:57:24.289708 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7cggq" Oct 01 12:57:24 crc kubenswrapper[4851]: I1001 12:57:24.339141 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0390df89-e5c5-4684-8ef5-3006aed29cd6" path="/var/lib/kubelet/pods/0390df89-e5c5-4684-8ef5-3006aed29cd6/volumes" Oct 01 12:57:24 crc kubenswrapper[4851]: I1001 12:57:24.340393 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04430c41-7121-4c5d-803b-db06d0dd7237" path="/var/lib/kubelet/pods/04430c41-7121-4c5d-803b-db06d0dd7237/volumes" Oct 01 12:57:24 crc kubenswrapper[4851]: I1001 12:57:24.341261 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9867a8d1-19de-4887-a5da-5d13588544c0" path="/var/lib/kubelet/pods/9867a8d1-19de-4887-a5da-5d13588544c0/volumes" Oct 01 12:57:24 crc kubenswrapper[4851]: I1001 12:57:24.344010 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e723ab1e-2b65-4236-91f4-1dbae3e4acf7" path="/var/lib/kubelet/pods/e723ab1e-2b65-4236-91f4-1dbae3e4acf7/volumes" Oct 01 12:57:24 crc kubenswrapper[4851]: I1001 12:57:24.344636 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed8f6b65-9192-41e4-80dd-81688a0714a8" path="/var/lib/kubelet/pods/ed8f6b65-9192-41e4-80dd-81688a0714a8/volumes" Oct 01 12:57:24 crc kubenswrapper[4851]: I1001 12:57:24.722117 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7cggq"] Oct 01 12:57:24 crc kubenswrapper[4851]: W1001 12:57:24.722594 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd203b3c_a861_4dba_9df9_7328c541c294.slice/crio-a3ecadc3f2b6fadba7b481b44fa04eb47c9b2949c84a63e15f0dad3b7f0aa25d 
WatchSource:0}: Error finding container a3ecadc3f2b6fadba7b481b44fa04eb47c9b2949c84a63e15f0dad3b7f0aa25d: Status 404 returned error can't find the container with id a3ecadc3f2b6fadba7b481b44fa04eb47c9b2949c84a63e15f0dad3b7f0aa25d Oct 01 12:57:25 crc kubenswrapper[4851]: I1001 12:57:25.469766 4851 generic.go:334] "Generic (PLEG): container finished" podID="dd203b3c-a861-4dba-9df9-7328c541c294" containerID="372be6bd50b706252f67c40bf146019dde8dcf14fc6a9e744fdf8fa8c4afef0a" exitCode=0 Oct 01 12:57:25 crc kubenswrapper[4851]: I1001 12:57:25.469818 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7cggq" event={"ID":"dd203b3c-a861-4dba-9df9-7328c541c294","Type":"ContainerDied","Data":"372be6bd50b706252f67c40bf146019dde8dcf14fc6a9e744fdf8fa8c4afef0a"} Oct 01 12:57:25 crc kubenswrapper[4851]: I1001 12:57:25.470168 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7cggq" event={"ID":"dd203b3c-a861-4dba-9df9-7328c541c294","Type":"ContainerStarted","Data":"a3ecadc3f2b6fadba7b481b44fa04eb47c9b2949c84a63e15f0dad3b7f0aa25d"} Oct 01 12:57:25 crc kubenswrapper[4851]: I1001 12:57:25.766852 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jgwl2"] Oct 01 12:57:25 crc kubenswrapper[4851]: I1001 12:57:25.769388 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jgwl2" Oct 01 12:57:25 crc kubenswrapper[4851]: I1001 12:57:25.774281 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 01 12:57:25 crc kubenswrapper[4851]: I1001 12:57:25.779428 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jgwl2"] Oct 01 12:57:25 crc kubenswrapper[4851]: I1001 12:57:25.831072 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9x5q\" (UniqueName: \"kubernetes.io/projected/60fbd453-e33b-463e-8dc0-e82251bdec0d-kube-api-access-j9x5q\") pod \"community-operators-jgwl2\" (UID: \"60fbd453-e33b-463e-8dc0-e82251bdec0d\") " pod="openshift-marketplace/community-operators-jgwl2" Oct 01 12:57:25 crc kubenswrapper[4851]: I1001 12:57:25.831154 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60fbd453-e33b-463e-8dc0-e82251bdec0d-utilities\") pod \"community-operators-jgwl2\" (UID: \"60fbd453-e33b-463e-8dc0-e82251bdec0d\") " pod="openshift-marketplace/community-operators-jgwl2" Oct 01 12:57:25 crc kubenswrapper[4851]: I1001 12:57:25.831207 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60fbd453-e33b-463e-8dc0-e82251bdec0d-catalog-content\") pod \"community-operators-jgwl2\" (UID: \"60fbd453-e33b-463e-8dc0-e82251bdec0d\") " pod="openshift-marketplace/community-operators-jgwl2" Oct 01 12:57:25 crc kubenswrapper[4851]: I1001 12:57:25.933052 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9x5q\" (UniqueName: \"kubernetes.io/projected/60fbd453-e33b-463e-8dc0-e82251bdec0d-kube-api-access-j9x5q\") pod \"community-operators-jgwl2\" (UID: \"60fbd453-e33b-463e-8dc0-e82251bdec0d\") " pod="openshift-marketplace/community-operators-jgwl2" Oct 01 12:57:25 crc kubenswrapper[4851]: 
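From the SyncLoop ADD for redhat-operators-7cggq onward, each replacement catalog pod goes through the same mount pipeline: VerifyControllerAttachedVolume started, MountVolume started, then MountVolume.SetUp succeeded per volume. The manager.go:1169 "Failed to process watch event ... 404" warnings in between are, in this context, most plausibly a benign race in which cadvisor tries to watch a freshly created crio-* cgroup before the container is registered; the pods in question start normally afterwards. A rough sketch for measuring the gap between first reconcile and successful SetUp per volume, under the same one-entry-per-line assumption as before:

    // mountlatency.go - illustrative sketch: time from the reconciler first
    // seeing a volume to MountVolume.SetUp succeeding, keyed by UniqueName.
    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
        "strings"
        "time"
    )

    var (
        klogTS     = regexp.MustCompile(`[IWE]\d{4} (\d{2}:\d{2}:\d{2}\.\d{6})`)
        uniqueName = regexp.MustCompile(`UniqueName: \\"([^\\]+)\\"`)
    )

    func main() {
        if len(os.Args) < 2 {
            fmt.Fprintln(os.Stderr, "usage: mountlatency <journal.txt>")
            os.Exit(2)
        }
        f, err := os.Open(os.Args[1])
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        defer f.Close()

        started := map[string]time.Time{}
        sc := bufio.NewScanner(f)
        sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
        for sc.Scan() {
            line := sc.Text()
            tm := klogTS.FindStringSubmatch(line)
            um := uniqueName.FindStringSubmatch(line)
            if tm == nil || um == nil {
                continue
            }
            t, err := time.Parse("15:04:05.000000", tm[1])
            if err != nil {
                continue
            }
            switch {
            case strings.Contains(line, "VerifyControllerAttachedVolume started"):
                if _, seen := started[um[1]]; !seen {
                    started[um[1]] = t // first reconcile of this volume
                }
            case strings.Contains(line, "MountVolume.SetUp succeeded"):
                if t0, seen := started[um[1]]; seen {
                    fmt.Printf("%v\t%s\n", t.Sub(t0), um[1])
                    delete(started, um[1])
                }
            }
        }
    }

Keying on the UniqueName rather than the volume name keeps identically named volumes from colliding, since utilities and catalog-content recur in every catalog pod.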
I1001 12:57:25.933163 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60fbd453-e33b-463e-8dc0-e82251bdec0d-utilities\") pod \"community-operators-jgwl2\" (UID: \"60fbd453-e33b-463e-8dc0-e82251bdec0d\") " pod="openshift-marketplace/community-operators-jgwl2" Oct 01 12:57:25 crc kubenswrapper[4851]: I1001 12:57:25.933252 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60fbd453-e33b-463e-8dc0-e82251bdec0d-catalog-content\") pod \"community-operators-jgwl2\" (UID: \"60fbd453-e33b-463e-8dc0-e82251bdec0d\") " pod="openshift-marketplace/community-operators-jgwl2" Oct 01 12:57:25 crc kubenswrapper[4851]: I1001 12:57:25.933991 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60fbd453-e33b-463e-8dc0-e82251bdec0d-utilities\") pod \"community-operators-jgwl2\" (UID: \"60fbd453-e33b-463e-8dc0-e82251bdec0d\") " pod="openshift-marketplace/community-operators-jgwl2" Oct 01 12:57:25 crc kubenswrapper[4851]: I1001 12:57:25.934187 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60fbd453-e33b-463e-8dc0-e82251bdec0d-catalog-content\") pod \"community-operators-jgwl2\" (UID: \"60fbd453-e33b-463e-8dc0-e82251bdec0d\") " pod="openshift-marketplace/community-operators-jgwl2" Oct 01 12:57:25 crc kubenswrapper[4851]: I1001 12:57:25.955174 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9x5q\" (UniqueName: \"kubernetes.io/projected/60fbd453-e33b-463e-8dc0-e82251bdec0d-kube-api-access-j9x5q\") pod \"community-operators-jgwl2\" (UID: \"60fbd453-e33b-463e-8dc0-e82251bdec0d\") " pod="openshift-marketplace/community-operators-jgwl2" Oct 01 12:57:26 crc kubenswrapper[4851]: I1001 12:57:26.154389 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jgwl2" Oct 01 12:57:26 crc kubenswrapper[4851]: I1001 12:57:26.369026 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9mgxt"] Oct 01 12:57:26 crc kubenswrapper[4851]: I1001 12:57:26.370285 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9mgxt" Oct 01 12:57:26 crc kubenswrapper[4851]: I1001 12:57:26.372438 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 01 12:57:26 crc kubenswrapper[4851]: I1001 12:57:26.390631 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9mgxt"] Oct 01 12:57:26 crc kubenswrapper[4851]: I1001 12:57:26.446640 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70558de8-d877-4f24-a9ee-18f3696799a8-catalog-content\") pod \"certified-operators-9mgxt\" (UID: \"70558de8-d877-4f24-a9ee-18f3696799a8\") " pod="openshift-marketplace/certified-operators-9mgxt" Oct 01 12:57:26 crc kubenswrapper[4851]: I1001 12:57:26.447075 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrlxz\" (UniqueName: \"kubernetes.io/projected/70558de8-d877-4f24-a9ee-18f3696799a8-kube-api-access-lrlxz\") pod \"certified-operators-9mgxt\" (UID: \"70558de8-d877-4f24-a9ee-18f3696799a8\") " pod="openshift-marketplace/certified-operators-9mgxt" Oct 01 12:57:26 crc kubenswrapper[4851]: I1001 12:57:26.447306 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70558de8-d877-4f24-a9ee-18f3696799a8-utilities\") pod \"certified-operators-9mgxt\" (UID: \"70558de8-d877-4f24-a9ee-18f3696799a8\") " pod="openshift-marketplace/certified-operators-9mgxt" Oct 01 12:57:26 crc kubenswrapper[4851]: I1001 12:57:26.550954 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70558de8-d877-4f24-a9ee-18f3696799a8-utilities\") pod \"certified-operators-9mgxt\" (UID: \"70558de8-d877-4f24-a9ee-18f3696799a8\") " pod="openshift-marketplace/certified-operators-9mgxt" Oct 01 12:57:26 crc kubenswrapper[4851]: I1001 12:57:26.551212 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70558de8-d877-4f24-a9ee-18f3696799a8-catalog-content\") pod \"certified-operators-9mgxt\" (UID: \"70558de8-d877-4f24-a9ee-18f3696799a8\") " pod="openshift-marketplace/certified-operators-9mgxt" Oct 01 12:57:26 crc kubenswrapper[4851]: I1001 12:57:26.551259 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrlxz\" (UniqueName: \"kubernetes.io/projected/70558de8-d877-4f24-a9ee-18f3696799a8-kube-api-access-lrlxz\") pod \"certified-operators-9mgxt\" (UID: \"70558de8-d877-4f24-a9ee-18f3696799a8\") " pod="openshift-marketplace/certified-operators-9mgxt" Oct 01 12:57:26 crc kubenswrapper[4851]: I1001 12:57:26.551387 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70558de8-d877-4f24-a9ee-18f3696799a8-utilities\") pod \"certified-operators-9mgxt\" (UID: \"70558de8-d877-4f24-a9ee-18f3696799a8\") " pod="openshift-marketplace/certified-operators-9mgxt" Oct 01 12:57:26 crc kubenswrapper[4851]: I1001 12:57:26.552033 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70558de8-d877-4f24-a9ee-18f3696799a8-catalog-content\") pod \"certified-operators-9mgxt\" (UID: 
\"70558de8-d877-4f24-a9ee-18f3696799a8\") " pod="openshift-marketplace/certified-operators-9mgxt" Oct 01 12:57:26 crc kubenswrapper[4851]: I1001 12:57:26.557366 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jgwl2"] Oct 01 12:57:26 crc kubenswrapper[4851]: W1001 12:57:26.568748 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60fbd453_e33b_463e_8dc0_e82251bdec0d.slice/crio-7a5989ec835f88a4df4b9c0ffb8691b9253d3fc56ee52d1277d801f4f5072702 WatchSource:0}: Error finding container 7a5989ec835f88a4df4b9c0ffb8691b9253d3fc56ee52d1277d801f4f5072702: Status 404 returned error can't find the container with id 7a5989ec835f88a4df4b9c0ffb8691b9253d3fc56ee52d1277d801f4f5072702 Oct 01 12:57:26 crc kubenswrapper[4851]: I1001 12:57:26.572975 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrlxz\" (UniqueName: \"kubernetes.io/projected/70558de8-d877-4f24-a9ee-18f3696799a8-kube-api-access-lrlxz\") pod \"certified-operators-9mgxt\" (UID: \"70558de8-d877-4f24-a9ee-18f3696799a8\") " pod="openshift-marketplace/certified-operators-9mgxt" Oct 01 12:57:26 crc kubenswrapper[4851]: I1001 12:57:26.691477 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9mgxt" Oct 01 12:57:26 crc kubenswrapper[4851]: I1001 12:57:26.972775 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9mgxt"] Oct 01 12:57:26 crc kubenswrapper[4851]: W1001 12:57:26.985953 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70558de8_d877_4f24_a9ee_18f3696799a8.slice/crio-241ae39a6d567ea58b1f8f7c3fb6151165e7b60b7be14cad2dc5c07856c2b42e WatchSource:0}: Error finding container 241ae39a6d567ea58b1f8f7c3fb6151165e7b60b7be14cad2dc5c07856c2b42e: Status 404 returned error can't find the container with id 241ae39a6d567ea58b1f8f7c3fb6151165e7b60b7be14cad2dc5c07856c2b42e Oct 01 12:57:27 crc kubenswrapper[4851]: I1001 12:57:27.485191 4851 generic.go:334] "Generic (PLEG): container finished" podID="60fbd453-e33b-463e-8dc0-e82251bdec0d" containerID="de2267088dd1418f67b73e01ea4d22ef50c1ab7579cdd18c135898dda4dbf7a0" exitCode=0 Oct 01 12:57:27 crc kubenswrapper[4851]: I1001 12:57:27.485433 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jgwl2" event={"ID":"60fbd453-e33b-463e-8dc0-e82251bdec0d","Type":"ContainerDied","Data":"de2267088dd1418f67b73e01ea4d22ef50c1ab7579cdd18c135898dda4dbf7a0"} Oct 01 12:57:27 crc kubenswrapper[4851]: I1001 12:57:27.485458 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jgwl2" event={"ID":"60fbd453-e33b-463e-8dc0-e82251bdec0d","Type":"ContainerStarted","Data":"7a5989ec835f88a4df4b9c0ffb8691b9253d3fc56ee52d1277d801f4f5072702"} Oct 01 12:57:27 crc kubenswrapper[4851]: I1001 12:57:27.491675 4851 generic.go:334] "Generic (PLEG): container finished" podID="70558de8-d877-4f24-a9ee-18f3696799a8" containerID="5168fafd2853a9a941f7a7f8f6120fd58ce55eee6afb2289f8cc91f85260d10c" exitCode=0 Oct 01 12:57:27 crc kubenswrapper[4851]: I1001 12:57:27.491774 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9mgxt" 
event={"ID":"70558de8-d877-4f24-a9ee-18f3696799a8","Type":"ContainerDied","Data":"5168fafd2853a9a941f7a7f8f6120fd58ce55eee6afb2289f8cc91f85260d10c"} Oct 01 12:57:27 crc kubenswrapper[4851]: I1001 12:57:27.491816 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9mgxt" event={"ID":"70558de8-d877-4f24-a9ee-18f3696799a8","Type":"ContainerStarted","Data":"241ae39a6d567ea58b1f8f7c3fb6151165e7b60b7be14cad2dc5c07856c2b42e"} Oct 01 12:57:27 crc kubenswrapper[4851]: I1001 12:57:27.494963 4851 generic.go:334] "Generic (PLEG): container finished" podID="dd203b3c-a861-4dba-9df9-7328c541c294" containerID="8748275bc06f5a9c8a794a6ab481219cd137f446c1adc70b8e2563cbec3867d9" exitCode=0 Oct 01 12:57:27 crc kubenswrapper[4851]: I1001 12:57:27.494992 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7cggq" event={"ID":"dd203b3c-a861-4dba-9df9-7328c541c294","Type":"ContainerDied","Data":"8748275bc06f5a9c8a794a6ab481219cd137f446c1adc70b8e2563cbec3867d9"} Oct 01 12:57:28 crc kubenswrapper[4851]: I1001 12:57:28.174275 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-knk85"] Oct 01 12:57:28 crc kubenswrapper[4851]: I1001 12:57:28.179602 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-knk85"] Oct 01 12:57:28 crc kubenswrapper[4851]: I1001 12:57:28.179682 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-knk85" Oct 01 12:57:28 crc kubenswrapper[4851]: I1001 12:57:28.220745 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 01 12:57:28 crc kubenswrapper[4851]: I1001 12:57:28.271836 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b535b20d-942b-417e-8580-f29cebe2401f-utilities\") pod \"redhat-marketplace-knk85\" (UID: \"b535b20d-942b-417e-8580-f29cebe2401f\") " pod="openshift-marketplace/redhat-marketplace-knk85" Oct 01 12:57:28 crc kubenswrapper[4851]: I1001 12:57:28.271872 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b535b20d-942b-417e-8580-f29cebe2401f-catalog-content\") pod \"redhat-marketplace-knk85\" (UID: \"b535b20d-942b-417e-8580-f29cebe2401f\") " pod="openshift-marketplace/redhat-marketplace-knk85" Oct 01 12:57:28 crc kubenswrapper[4851]: I1001 12:57:28.271901 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p62fd\" (UniqueName: \"kubernetes.io/projected/b535b20d-942b-417e-8580-f29cebe2401f-kube-api-access-p62fd\") pod \"redhat-marketplace-knk85\" (UID: \"b535b20d-942b-417e-8580-f29cebe2401f\") " pod="openshift-marketplace/redhat-marketplace-knk85" Oct 01 12:57:28 crc kubenswrapper[4851]: I1001 12:57:28.373321 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b535b20d-942b-417e-8580-f29cebe2401f-utilities\") pod \"redhat-marketplace-knk85\" (UID: \"b535b20d-942b-417e-8580-f29cebe2401f\") " pod="openshift-marketplace/redhat-marketplace-knk85" Oct 01 12:57:28 crc kubenswrapper[4851]: I1001 12:57:28.373369 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b535b20d-942b-417e-8580-f29cebe2401f-catalog-content\") pod \"redhat-marketplace-knk85\" (UID: \"b535b20d-942b-417e-8580-f29cebe2401f\") " pod="openshift-marketplace/redhat-marketplace-knk85" Oct 01 12:57:28 crc kubenswrapper[4851]: I1001 12:57:28.373408 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p62fd\" (UniqueName: \"kubernetes.io/projected/b535b20d-942b-417e-8580-f29cebe2401f-kube-api-access-p62fd\") pod \"redhat-marketplace-knk85\" (UID: \"b535b20d-942b-417e-8580-f29cebe2401f\") " pod="openshift-marketplace/redhat-marketplace-knk85" Oct 01 12:57:28 crc kubenswrapper[4851]: I1001 12:57:28.374413 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b535b20d-942b-417e-8580-f29cebe2401f-catalog-content\") pod \"redhat-marketplace-knk85\" (UID: \"b535b20d-942b-417e-8580-f29cebe2401f\") " pod="openshift-marketplace/redhat-marketplace-knk85" Oct 01 12:57:28 crc kubenswrapper[4851]: I1001 12:57:28.375479 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b535b20d-942b-417e-8580-f29cebe2401f-utilities\") pod \"redhat-marketplace-knk85\" (UID: \"b535b20d-942b-417e-8580-f29cebe2401f\") " pod="openshift-marketplace/redhat-marketplace-knk85" Oct 01 12:57:28 crc kubenswrapper[4851]: I1001 12:57:28.407692 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p62fd\" (UniqueName: \"kubernetes.io/projected/b535b20d-942b-417e-8580-f29cebe2401f-kube-api-access-p62fd\") pod \"redhat-marketplace-knk85\" (UID: \"b535b20d-942b-417e-8580-f29cebe2401f\") " pod="openshift-marketplace/redhat-marketplace-knk85" Oct 01 12:57:28 crc kubenswrapper[4851]: I1001 12:57:28.511282 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jgwl2" event={"ID":"60fbd453-e33b-463e-8dc0-e82251bdec0d","Type":"ContainerStarted","Data":"fe408fefc8ff4c1ad3791113262749af497f1049e3651f4c2c63e66d0c6cac2f"} Oct 01 12:57:28 crc kubenswrapper[4851]: I1001 12:57:28.513787 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9mgxt" event={"ID":"70558de8-d877-4f24-a9ee-18f3696799a8","Type":"ContainerStarted","Data":"a9fc49c9f768d492891271a41211029e80c7a9126ebafedec9c8ea7540ba5c0f"} Oct 01 12:57:28 crc kubenswrapper[4851]: I1001 12:57:28.515462 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7cggq" event={"ID":"dd203b3c-a861-4dba-9df9-7328c541c294","Type":"ContainerStarted","Data":"27e6c8b9f04e2c3569d8005e4ccaceade26cf36f4b102f3346baf7ac41949437"} Oct 01 12:57:28 crc kubenswrapper[4851]: I1001 12:57:28.533733 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-knk85" Oct 01 12:57:28 crc kubenswrapper[4851]: I1001 12:57:28.548724 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7cggq" podStartSLOduration=3.080574809 podStartE2EDuration="5.5487085s" podCreationTimestamp="2025-10-01 12:57:23 +0000 UTC" firstStartedPulling="2025-10-01 12:57:25.47212117 +0000 UTC m=+253.817238706" lastFinishedPulling="2025-10-01 12:57:27.940254871 +0000 UTC m=+256.285372397" observedRunningTime="2025-10-01 12:57:28.545153609 +0000 UTC m=+256.890271095" watchObservedRunningTime="2025-10-01 12:57:28.5487085 +0000 UTC m=+256.893825986" Oct 01 12:57:28 crc kubenswrapper[4851]: I1001 12:57:28.976059 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-knk85"] Oct 01 12:57:29 crc kubenswrapper[4851]: I1001 12:57:29.523440 4851 generic.go:334] "Generic (PLEG): container finished" podID="70558de8-d877-4f24-a9ee-18f3696799a8" containerID="a9fc49c9f768d492891271a41211029e80c7a9126ebafedec9c8ea7540ba5c0f" exitCode=0 Oct 01 12:57:29 crc kubenswrapper[4851]: I1001 12:57:29.523596 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9mgxt" event={"ID":"70558de8-d877-4f24-a9ee-18f3696799a8","Type":"ContainerDied","Data":"a9fc49c9f768d492891271a41211029e80c7a9126ebafedec9c8ea7540ba5c0f"} Oct 01 12:57:29 crc kubenswrapper[4851]: I1001 12:57:29.525670 4851 generic.go:334] "Generic (PLEG): container finished" podID="b535b20d-942b-417e-8580-f29cebe2401f" containerID="474840257c2e61bec0c8ca0f9478f185e377bd59dff1618e7606cd52b01f5ce3" exitCode=0 Oct 01 12:57:29 crc kubenswrapper[4851]: I1001 12:57:29.525755 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-knk85" event={"ID":"b535b20d-942b-417e-8580-f29cebe2401f","Type":"ContainerDied","Data":"474840257c2e61bec0c8ca0f9478f185e377bd59dff1618e7606cd52b01f5ce3"} Oct 01 12:57:29 crc kubenswrapper[4851]: I1001 12:57:29.525794 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-knk85" event={"ID":"b535b20d-942b-417e-8580-f29cebe2401f","Type":"ContainerStarted","Data":"d60670b21a8730a06e785c65ad7ede96d36bc543640c1714ab5c3c22c991dc34"} Oct 01 12:57:29 crc kubenswrapper[4851]: I1001 12:57:29.530682 4851 generic.go:334] "Generic (PLEG): container finished" podID="60fbd453-e33b-463e-8dc0-e82251bdec0d" containerID="fe408fefc8ff4c1ad3791113262749af497f1049e3651f4c2c63e66d0c6cac2f" exitCode=0 Oct 01 12:57:29 crc kubenswrapper[4851]: I1001 12:57:29.531627 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jgwl2" event={"ID":"60fbd453-e33b-463e-8dc0-e82251bdec0d","Type":"ContainerDied","Data":"fe408fefc8ff4c1ad3791113262749af497f1049e3651f4c2c63e66d0c6cac2f"} Oct 01 12:57:31 crc kubenswrapper[4851]: I1001 12:57:31.541960 4851 generic.go:334] "Generic (PLEG): container finished" podID="b535b20d-942b-417e-8580-f29cebe2401f" containerID="5641790c6e176f7451d092717fbd86f52eef2dddec3835625165dd1cc0692eb8" exitCode=0 Oct 01 12:57:31 crc kubenswrapper[4851]: I1001 12:57:31.542131 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-knk85" event={"ID":"b535b20d-942b-417e-8580-f29cebe2401f","Type":"ContainerDied","Data":"5641790c6e176f7451d092717fbd86f52eef2dddec3835625165dd1cc0692eb8"} Oct 01 12:57:31 crc kubenswrapper[4851]: I1001 
12:57:31.545738 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jgwl2" event={"ID":"60fbd453-e33b-463e-8dc0-e82251bdec0d","Type":"ContainerStarted","Data":"ac4bcfe66996752bf983202ed4f171354d725d7e785a076f46cd9e3c4a9256ad"} Oct 01 12:57:31 crc kubenswrapper[4851]: I1001 12:57:31.548568 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9mgxt" event={"ID":"70558de8-d877-4f24-a9ee-18f3696799a8","Type":"ContainerStarted","Data":"fb4153c7fefb12eaf1ab8dc33ac045ed24f0e81553b28254f0449527add1a562"} Oct 01 12:57:31 crc kubenswrapper[4851]: I1001 12:57:31.618660 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9mgxt" podStartSLOduration=3.183123293 podStartE2EDuration="5.618640611s" podCreationTimestamp="2025-10-01 12:57:26 +0000 UTC" firstStartedPulling="2025-10-01 12:57:27.493192238 +0000 UTC m=+255.838309714" lastFinishedPulling="2025-10-01 12:57:29.928709546 +0000 UTC m=+258.273827032" observedRunningTime="2025-10-01 12:57:31.61442014 +0000 UTC m=+259.959537636" watchObservedRunningTime="2025-10-01 12:57:31.618640611 +0000 UTC m=+259.963758117" Oct 01 12:57:31 crc kubenswrapper[4851]: I1001 12:57:31.619256 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jgwl2" podStartSLOduration=4.100802612 podStartE2EDuration="6.619248468s" podCreationTimestamp="2025-10-01 12:57:25 +0000 UTC" firstStartedPulling="2025-10-01 12:57:27.487512276 +0000 UTC m=+255.832629762" lastFinishedPulling="2025-10-01 12:57:30.005958122 +0000 UTC m=+258.351075618" observedRunningTime="2025-10-01 12:57:31.599371301 +0000 UTC m=+259.944488837" watchObservedRunningTime="2025-10-01 12:57:31.619248468 +0000 UTC m=+259.964365964" Oct 01 12:57:32 crc kubenswrapper[4851]: I1001 12:57:32.558144 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-knk85" event={"ID":"b535b20d-942b-417e-8580-f29cebe2401f","Type":"ContainerStarted","Data":"4e42e46c83e0db2c79f6af4bc8a86b4e16c509859afa09764cf4ba1be87b91e0"} Oct 01 12:57:32 crc kubenswrapper[4851]: I1001 12:57:32.579567 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-knk85" podStartSLOduration=1.8777937919999999 podStartE2EDuration="4.579547619s" podCreationTimestamp="2025-10-01 12:57:28 +0000 UTC" firstStartedPulling="2025-10-01 12:57:29.527453011 +0000 UTC m=+257.872570527" lastFinishedPulling="2025-10-01 12:57:32.229206868 +0000 UTC m=+260.574324354" observedRunningTime="2025-10-01 12:57:32.576468692 +0000 UTC m=+260.921586178" watchObservedRunningTime="2025-10-01 12:57:32.579547619 +0000 UTC m=+260.924665125" Oct 01 12:57:34 crc kubenswrapper[4851]: I1001 12:57:34.290523 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7cggq" Oct 01 12:57:34 crc kubenswrapper[4851]: I1001 12:57:34.290601 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7cggq" Oct 01 12:57:34 crc kubenswrapper[4851]: I1001 12:57:34.365729 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7cggq" Oct 01 12:57:34 crc kubenswrapper[4851]: I1001 12:57:34.632795 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-7cggq" Oct 01 12:57:36 crc kubenswrapper[4851]: I1001 12:57:36.155553 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jgwl2" Oct 01 12:57:36 crc kubenswrapper[4851]: I1001 12:57:36.155950 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jgwl2" Oct 01 12:57:36 crc kubenswrapper[4851]: I1001 12:57:36.208753 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jgwl2" Oct 01 12:57:36 crc kubenswrapper[4851]: I1001 12:57:36.633826 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jgwl2" Oct 01 12:57:36 crc kubenswrapper[4851]: I1001 12:57:36.692100 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9mgxt" Oct 01 12:57:36 crc kubenswrapper[4851]: I1001 12:57:36.692469 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9mgxt" Oct 01 12:57:36 crc kubenswrapper[4851]: I1001 12:57:36.735970 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9mgxt" Oct 01 12:57:37 crc kubenswrapper[4851]: I1001 12:57:37.627740 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9mgxt" Oct 01 12:57:38 crc kubenswrapper[4851]: I1001 12:57:38.534678 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-knk85" Oct 01 12:57:38 crc kubenswrapper[4851]: I1001 12:57:38.535090 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-knk85" Oct 01 12:57:38 crc kubenswrapper[4851]: I1001 12:57:38.584326 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-knk85" Oct 01 12:57:38 crc kubenswrapper[4851]: I1001 12:57:38.643742 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-knk85" Oct 01 12:59:00 crc kubenswrapper[4851]: I1001 12:59:00.050551 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:59:00 crc kubenswrapper[4851]: I1001 12:59:00.051089 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:59:30 crc kubenswrapper[4851]: I1001 12:59:30.051101 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:59:30 crc kubenswrapper[4851]: I1001 12:59:30.052860 4851 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.050781 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.051661 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.051732 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.053221 4851 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c0cb68eadfaa5fa243dbc130588998253b08c77917d93919066ac304838046a9"} pod="openshift-machine-config-operator/machine-config-daemon-fv72m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.053358 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" containerID="cri-o://c0cb68eadfaa5fa243dbc130588998253b08c77917d93919066ac304838046a9" gracePeriod=600 Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.159334 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322060-nt85r"] Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.160829 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-nt85r" Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.169175 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.169815 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.189275 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322060-nt85r"] Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.249295 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee89af1b-9cae-40d0-a6a4-555b63d3e419-config-volume\") pod \"collect-profiles-29322060-nt85r\" (UID: \"ee89af1b-9cae-40d0-a6a4-555b63d3e419\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-nt85r" Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.249369 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee89af1b-9cae-40d0-a6a4-555b63d3e419-secret-volume\") pod \"collect-profiles-29322060-nt85r\" (UID: \"ee89af1b-9cae-40d0-a6a4-555b63d3e419\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-nt85r" Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.249443 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv857\" (UniqueName: \"kubernetes.io/projected/ee89af1b-9cae-40d0-a6a4-555b63d3e419-kube-api-access-qv857\") pod \"collect-profiles-29322060-nt85r\" (UID: \"ee89af1b-9cae-40d0-a6a4-555b63d3e419\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-nt85r" Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.350735 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee89af1b-9cae-40d0-a6a4-555b63d3e419-config-volume\") pod \"collect-profiles-29322060-nt85r\" (UID: \"ee89af1b-9cae-40d0-a6a4-555b63d3e419\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-nt85r" Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.350824 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee89af1b-9cae-40d0-a6a4-555b63d3e419-secret-volume\") pod \"collect-profiles-29322060-nt85r\" (UID: \"ee89af1b-9cae-40d0-a6a4-555b63d3e419\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-nt85r" Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.350911 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv857\" (UniqueName: \"kubernetes.io/projected/ee89af1b-9cae-40d0-a6a4-555b63d3e419-kube-api-access-qv857\") pod \"collect-profiles-29322060-nt85r\" (UID: \"ee89af1b-9cae-40d0-a6a4-555b63d3e419\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-nt85r" Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.352074 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee89af1b-9cae-40d0-a6a4-555b63d3e419-config-volume\") pod 
\"collect-profiles-29322060-nt85r\" (UID: \"ee89af1b-9cae-40d0-a6a4-555b63d3e419\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-nt85r" Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.362311 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee89af1b-9cae-40d0-a6a4-555b63d3e419-secret-volume\") pod \"collect-profiles-29322060-nt85r\" (UID: \"ee89af1b-9cae-40d0-a6a4-555b63d3e419\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-nt85r" Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.386219 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv857\" (UniqueName: \"kubernetes.io/projected/ee89af1b-9cae-40d0-a6a4-555b63d3e419-kube-api-access-qv857\") pod \"collect-profiles-29322060-nt85r\" (UID: \"ee89af1b-9cae-40d0-a6a4-555b63d3e419\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-nt85r" Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.527740 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-nt85r" Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.552164 4851 generic.go:334] "Generic (PLEG): container finished" podID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerID="c0cb68eadfaa5fa243dbc130588998253b08c77917d93919066ac304838046a9" exitCode=0 Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.552307 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerDied","Data":"c0cb68eadfaa5fa243dbc130588998253b08c77917d93919066ac304838046a9"} Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.553059 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"df5480c0ce0978a756a190c53fc2c8d0701f9def62fec11373bfd7467e9dc90a"} Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.553115 4851 scope.go:117] "RemoveContainer" containerID="10eeaf2582a1b597b86717bcecde16a898692b71f2aa6d33f472f681feaaaab6" Oct 01 13:00:00 crc kubenswrapper[4851]: I1001 13:00:00.824909 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322060-nt85r"] Oct 01 13:00:01 crc kubenswrapper[4851]: I1001 13:00:01.563922 4851 generic.go:334] "Generic (PLEG): container finished" podID="ee89af1b-9cae-40d0-a6a4-555b63d3e419" containerID="b2a7aa1574ac7509dc8dc51b4fd2502f4a8130f2c3f33baa41b5fb2f9152991d" exitCode=0 Oct 01 13:00:01 crc kubenswrapper[4851]: I1001 13:00:01.564006 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-nt85r" event={"ID":"ee89af1b-9cae-40d0-a6a4-555b63d3e419","Type":"ContainerDied","Data":"b2a7aa1574ac7509dc8dc51b4fd2502f4a8130f2c3f33baa41b5fb2f9152991d"} Oct 01 13:00:01 crc kubenswrapper[4851]: I1001 13:00:01.564615 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-nt85r" event={"ID":"ee89af1b-9cae-40d0-a6a4-555b63d3e419","Type":"ContainerStarted","Data":"8b00fddc034bbccfa1e328ad4e2c7a306d63438fc47949242b227340b2ad4181"} Oct 01 13:00:02 crc kubenswrapper[4851]: I1001 13:00:02.828675 4851 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-nt85r" Oct 01 13:00:02 crc kubenswrapper[4851]: I1001 13:00:02.986763 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee89af1b-9cae-40d0-a6a4-555b63d3e419-secret-volume\") pod \"ee89af1b-9cae-40d0-a6a4-555b63d3e419\" (UID: \"ee89af1b-9cae-40d0-a6a4-555b63d3e419\") " Oct 01 13:00:02 crc kubenswrapper[4851]: I1001 13:00:02.986843 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee89af1b-9cae-40d0-a6a4-555b63d3e419-config-volume\") pod \"ee89af1b-9cae-40d0-a6a4-555b63d3e419\" (UID: \"ee89af1b-9cae-40d0-a6a4-555b63d3e419\") " Oct 01 13:00:02 crc kubenswrapper[4851]: I1001 13:00:02.986880 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv857\" (UniqueName: \"kubernetes.io/projected/ee89af1b-9cae-40d0-a6a4-555b63d3e419-kube-api-access-qv857\") pod \"ee89af1b-9cae-40d0-a6a4-555b63d3e419\" (UID: \"ee89af1b-9cae-40d0-a6a4-555b63d3e419\") " Oct 01 13:00:02 crc kubenswrapper[4851]: I1001 13:00:02.987756 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee89af1b-9cae-40d0-a6a4-555b63d3e419-config-volume" (OuterVolumeSpecName: "config-volume") pod "ee89af1b-9cae-40d0-a6a4-555b63d3e419" (UID: "ee89af1b-9cae-40d0-a6a4-555b63d3e419"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:00:02 crc kubenswrapper[4851]: I1001 13:00:02.995015 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee89af1b-9cae-40d0-a6a4-555b63d3e419-kube-api-access-qv857" (OuterVolumeSpecName: "kube-api-access-qv857") pod "ee89af1b-9cae-40d0-a6a4-555b63d3e419" (UID: "ee89af1b-9cae-40d0-a6a4-555b63d3e419"). InnerVolumeSpecName "kube-api-access-qv857". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:00:02 crc kubenswrapper[4851]: I1001 13:00:02.995283 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee89af1b-9cae-40d0-a6a4-555b63d3e419-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ee89af1b-9cae-40d0-a6a4-555b63d3e419" (UID: "ee89af1b-9cae-40d0-a6a4-555b63d3e419"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:00:03 crc kubenswrapper[4851]: I1001 13:00:03.088542 4851 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee89af1b-9cae-40d0-a6a4-555b63d3e419-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:00:03 crc kubenswrapper[4851]: I1001 13:00:03.088587 4851 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee89af1b-9cae-40d0-a6a4-555b63d3e419-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:00:03 crc kubenswrapper[4851]: I1001 13:00:03.088607 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv857\" (UniqueName: \"kubernetes.io/projected/ee89af1b-9cae-40d0-a6a4-555b63d3e419-kube-api-access-qv857\") on node \"crc\" DevicePath \"\"" Oct 01 13:00:03 crc kubenswrapper[4851]: I1001 13:00:03.582748 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-nt85r" event={"ID":"ee89af1b-9cae-40d0-a6a4-555b63d3e419","Type":"ContainerDied","Data":"8b00fddc034bbccfa1e328ad4e2c7a306d63438fc47949242b227340b2ad4181"} Oct 01 13:00:03 crc kubenswrapper[4851]: I1001 13:00:03.582796 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-nt85r" Oct 01 13:00:03 crc kubenswrapper[4851]: I1001 13:00:03.582806 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b00fddc034bbccfa1e328ad4e2c7a306d63438fc47949242b227340b2ad4181" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.491706 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-9mt9c"] Oct 01 13:00:16 crc kubenswrapper[4851]: E1001 13:00:16.492773 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee89af1b-9cae-40d0-a6a4-555b63d3e419" containerName="collect-profiles" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.492794 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee89af1b-9cae-40d0-a6a4-555b63d3e419" containerName="collect-profiles" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.492948 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee89af1b-9cae-40d0-a6a4-555b63d3e419" containerName="collect-profiles" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.493534 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.520624 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-9mt9c"] Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.689171 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.689229 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldq86\" (UniqueName: \"kubernetes.io/projected/e452bffe-f75f-450e-b6d8-ddf456d6f5ad-kube-api-access-ldq86\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.689275 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e452bffe-f75f-450e-b6d8-ddf456d6f5ad-registry-tls\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.689313 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e452bffe-f75f-450e-b6d8-ddf456d6f5ad-installation-pull-secrets\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.689336 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e452bffe-f75f-450e-b6d8-ddf456d6f5ad-trusted-ca\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.689362 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e452bffe-f75f-450e-b6d8-ddf456d6f5ad-bound-sa-token\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.689391 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e452bffe-f75f-450e-b6d8-ddf456d6f5ad-registry-certificates\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.689431 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/e452bffe-f75f-450e-b6d8-ddf456d6f5ad-ca-trust-extracted\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.718834 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.790809 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldq86\" (UniqueName: \"kubernetes.io/projected/e452bffe-f75f-450e-b6d8-ddf456d6f5ad-kube-api-access-ldq86\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.790872 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e452bffe-f75f-450e-b6d8-ddf456d6f5ad-registry-tls\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.790917 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e452bffe-f75f-450e-b6d8-ddf456d6f5ad-installation-pull-secrets\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.790944 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e452bffe-f75f-450e-b6d8-ddf456d6f5ad-trusted-ca\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.790968 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e452bffe-f75f-450e-b6d8-ddf456d6f5ad-bound-sa-token\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.790998 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e452bffe-f75f-450e-b6d8-ddf456d6f5ad-registry-certificates\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.791041 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e452bffe-f75f-450e-b6d8-ddf456d6f5ad-ca-trust-extracted\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.792313 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e452bffe-f75f-450e-b6d8-ddf456d6f5ad-ca-trust-extracted\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.793184 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e452bffe-f75f-450e-b6d8-ddf456d6f5ad-registry-certificates\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.793676 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e452bffe-f75f-450e-b6d8-ddf456d6f5ad-trusted-ca\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.799090 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e452bffe-f75f-450e-b6d8-ddf456d6f5ad-installation-pull-secrets\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.799877 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e452bffe-f75f-450e-b6d8-ddf456d6f5ad-registry-tls\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.821599 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e452bffe-f75f-450e-b6d8-ddf456d6f5ad-bound-sa-token\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:16 crc kubenswrapper[4851]: I1001 13:00:16.822108 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldq86\" (UniqueName: \"kubernetes.io/projected/e452bffe-f75f-450e-b6d8-ddf456d6f5ad-kube-api-access-ldq86\") pod \"image-registry-66df7c8f76-9mt9c\" (UID: \"e452bffe-f75f-450e-b6d8-ddf456d6f5ad\") " pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:17 crc kubenswrapper[4851]: I1001 13:00:17.117267 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:17 crc kubenswrapper[4851]: I1001 13:00:17.396579 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-9mt9c"] Oct 01 13:00:17 crc kubenswrapper[4851]: W1001 13:00:17.403008 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode452bffe_f75f_450e_b6d8_ddf456d6f5ad.slice/crio-012bcb286ec9aa0e5e0ae0a3c92826b5f38cf62cb53c479c21e43f81533d82d0 WatchSource:0}: Error finding container 012bcb286ec9aa0e5e0ae0a3c92826b5f38cf62cb53c479c21e43f81533d82d0: Status 404 returned error can't find the container with id 012bcb286ec9aa0e5e0ae0a3c92826b5f38cf62cb53c479c21e43f81533d82d0 Oct 01 13:00:17 crc kubenswrapper[4851]: I1001 13:00:17.677455 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" event={"ID":"e452bffe-f75f-450e-b6d8-ddf456d6f5ad","Type":"ContainerStarted","Data":"a5fc07e6769c9033ffe0ff280b58ff193e0534c7fffdf15abc921e7ed23407cc"} Oct 01 13:00:17 crc kubenswrapper[4851]: I1001 13:00:17.678053 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:17 crc kubenswrapper[4851]: I1001 13:00:17.678078 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" event={"ID":"e452bffe-f75f-450e-b6d8-ddf456d6f5ad","Type":"ContainerStarted","Data":"012bcb286ec9aa0e5e0ae0a3c92826b5f38cf62cb53c479c21e43f81533d82d0"} Oct 01 13:00:37 crc kubenswrapper[4851]: I1001 13:00:37.128535 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" Oct 01 13:00:37 crc kubenswrapper[4851]: I1001 13:00:37.159688 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-9mt9c" podStartSLOduration=21.159658915 podStartE2EDuration="21.159658915s" podCreationTimestamp="2025-10-01 13:00:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:00:17.715689118 +0000 UTC m=+426.060806624" watchObservedRunningTime="2025-10-01 13:00:37.159658915 +0000 UTC m=+445.504776471" Oct 01 13:00:37 crc kubenswrapper[4851]: I1001 13:00:37.204849 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6s9sc"] Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.269367 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" podUID="a54d2004-e4e0-4b2c-b026-e82a6f241da7" containerName="registry" containerID="cri-o://2e277d222ffa15dacb49fb9a061fb45830e0d13f76753162da71703d9e045a9b" gracePeriod=30 Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.655160 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.713711 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a54d2004-e4e0-4b2c-b026-e82a6f241da7-registry-tls\") pod \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.713793 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a54d2004-e4e0-4b2c-b026-e82a6f241da7-trusted-ca\") pod \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.713840 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a54d2004-e4e0-4b2c-b026-e82a6f241da7-bound-sa-token\") pod \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.713886 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a54d2004-e4e0-4b2c-b026-e82a6f241da7-installation-pull-secrets\") pod \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.713926 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a54d2004-e4e0-4b2c-b026-e82a6f241da7-registry-certificates\") pod \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.713965 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a54d2004-e4e0-4b2c-b026-e82a6f241da7-ca-trust-extracted\") pod \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.714232 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.714341 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs8cj\" (UniqueName: \"kubernetes.io/projected/a54d2004-e4e0-4b2c-b026-e82a6f241da7-kube-api-access-fs8cj\") pod \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\" (UID: \"a54d2004-e4e0-4b2c-b026-e82a6f241da7\") " Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.714515 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a54d2004-e4e0-4b2c-b026-e82a6f241da7-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a54d2004-e4e0-4b2c-b026-e82a6f241da7" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.714806 4851 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a54d2004-e4e0-4b2c-b026-e82a6f241da7-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.715355 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a54d2004-e4e0-4b2c-b026-e82a6f241da7-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a54d2004-e4e0-4b2c-b026-e82a6f241da7" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.719818 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a54d2004-e4e0-4b2c-b026-e82a6f241da7-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a54d2004-e4e0-4b2c-b026-e82a6f241da7" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.720128 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a54d2004-e4e0-4b2c-b026-e82a6f241da7-kube-api-access-fs8cj" (OuterVolumeSpecName: "kube-api-access-fs8cj") pod "a54d2004-e4e0-4b2c-b026-e82a6f241da7" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7"). InnerVolumeSpecName "kube-api-access-fs8cj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.723027 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a54d2004-e4e0-4b2c-b026-e82a6f241da7-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a54d2004-e4e0-4b2c-b026-e82a6f241da7" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.723301 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a54d2004-e4e0-4b2c-b026-e82a6f241da7-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a54d2004-e4e0-4b2c-b026-e82a6f241da7" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.735591 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "a54d2004-e4e0-4b2c-b026-e82a6f241da7" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.744889 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a54d2004-e4e0-4b2c-b026-e82a6f241da7-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a54d2004-e4e0-4b2c-b026-e82a6f241da7" (UID: "a54d2004-e4e0-4b2c-b026-e82a6f241da7"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.816103 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs8cj\" (UniqueName: \"kubernetes.io/projected/a54d2004-e4e0-4b2c-b026-e82a6f241da7-kube-api-access-fs8cj\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.816173 4851 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a54d2004-e4e0-4b2c-b026-e82a6f241da7-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.816204 4851 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a54d2004-e4e0-4b2c-b026-e82a6f241da7-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.816225 4851 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a54d2004-e4e0-4b2c-b026-e82a6f241da7-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.816246 4851 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a54d2004-e4e0-4b2c-b026-e82a6f241da7-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.816264 4851 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a54d2004-e4e0-4b2c-b026-e82a6f241da7-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.988230 4851 generic.go:334] "Generic (PLEG): container finished" podID="a54d2004-e4e0-4b2c-b026-e82a6f241da7" containerID="2e277d222ffa15dacb49fb9a061fb45830e0d13f76753162da71703d9e045a9b" exitCode=0 Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.988290 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.988289 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" event={"ID":"a54d2004-e4e0-4b2c-b026-e82a6f241da7","Type":"ContainerDied","Data":"2e277d222ffa15dacb49fb9a061fb45830e0d13f76753162da71703d9e045a9b"} Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.988355 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6s9sc" event={"ID":"a54d2004-e4e0-4b2c-b026-e82a6f241da7","Type":"ContainerDied","Data":"3b07dcc93d3937aa9db3944a9750d8ca78685d29d82fe3027fdec3d58d0fcdc1"} Oct 01 13:01:02 crc kubenswrapper[4851]: I1001 13:01:02.988375 4851 scope.go:117] "RemoveContainer" containerID="2e277d222ffa15dacb49fb9a061fb45830e0d13f76753162da71703d9e045a9b" Oct 01 13:01:03 crc kubenswrapper[4851]: I1001 13:01:03.004995 4851 scope.go:117] "RemoveContainer" containerID="2e277d222ffa15dacb49fb9a061fb45830e0d13f76753162da71703d9e045a9b" Oct 01 13:01:03 crc kubenswrapper[4851]: E1001 13:01:03.005403 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e277d222ffa15dacb49fb9a061fb45830e0d13f76753162da71703d9e045a9b\": container with ID starting with 2e277d222ffa15dacb49fb9a061fb45830e0d13f76753162da71703d9e045a9b not found: ID does not exist" containerID="2e277d222ffa15dacb49fb9a061fb45830e0d13f76753162da71703d9e045a9b" Oct 01 13:01:03 crc kubenswrapper[4851]: I1001 13:01:03.005473 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e277d222ffa15dacb49fb9a061fb45830e0d13f76753162da71703d9e045a9b"} err="failed to get container status \"2e277d222ffa15dacb49fb9a061fb45830e0d13f76753162da71703d9e045a9b\": rpc error: code = NotFound desc = could not find container \"2e277d222ffa15dacb49fb9a061fb45830e0d13f76753162da71703d9e045a9b\": container with ID starting with 2e277d222ffa15dacb49fb9a061fb45830e0d13f76753162da71703d9e045a9b not found: ID does not exist" Oct 01 13:01:03 crc kubenswrapper[4851]: I1001 13:01:03.011281 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6s9sc"] Oct 01 13:01:03 crc kubenswrapper[4851]: I1001 13:01:03.014312 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6s9sc"] Oct 01 13:01:04 crc kubenswrapper[4851]: I1001 13:01:04.343770 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a54d2004-e4e0-4b2c-b026-e82a6f241da7" path="/var/lib/kubelet/pods/a54d2004-e4e0-4b2c-b026-e82a6f241da7/volumes" Oct 01 13:02:00 crc kubenswrapper[4851]: I1001 13:02:00.050273 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:02:00 crc kubenswrapper[4851]: I1001 13:02:00.051145 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:02:30 crc 
kubenswrapper[4851]: I1001 13:02:30.050433 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:02:30 crc kubenswrapper[4851]: I1001 13:02:30.051195 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.131366 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-d6clp"] Oct 01 13:02:56 crc kubenswrapper[4851]: E1001 13:02:56.132333 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54d2004-e4e0-4b2c-b026-e82a6f241da7" containerName="registry" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.132356 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54d2004-e4e0-4b2c-b026-e82a6f241da7" containerName="registry" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.132571 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="a54d2004-e4e0-4b2c-b026-e82a6f241da7" containerName="registry" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.133132 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-d6clp" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.135617 4851 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-k8hrq" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.136229 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.136375 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-26dms"] Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.136998 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-26dms" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.145661 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-d6clp"] Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.149956 4851 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-pl77s" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.150737 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-26dms"] Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.155812 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.174240 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-dhkqx"] Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.174825 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-dhkqx" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.184057 4851 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-qq9mn" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.184317 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkgj9\" (UniqueName: \"kubernetes.io/projected/097254a7-5e3a-4d5b-90d5-f6f1ae0b4c3d-kube-api-access-hkgj9\") pod \"cert-manager-webhook-5655c58dd6-dhkqx\" (UID: \"097254a7-5e3a-4d5b-90d5-f6f1ae0b4c3d\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-dhkqx" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.184368 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m4c9\" (UniqueName: \"kubernetes.io/projected/435110f0-cfd8-423b-afdc-a9cfe0426a3a-kube-api-access-9m4c9\") pod \"cert-manager-5b446d88c5-26dms\" (UID: \"435110f0-cfd8-423b-afdc-a9cfe0426a3a\") " pod="cert-manager/cert-manager-5b446d88c5-26dms" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.184414 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqsl8\" (UniqueName: \"kubernetes.io/projected/f24da60a-9cbf-482f-a764-43145245be50-kube-api-access-vqsl8\") pod \"cert-manager-cainjector-7f985d654d-d6clp\" (UID: \"f24da60a-9cbf-482f-a764-43145245be50\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-d6clp" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.192691 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-dhkqx"] Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.285332 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m4c9\" (UniqueName: \"kubernetes.io/projected/435110f0-cfd8-423b-afdc-a9cfe0426a3a-kube-api-access-9m4c9\") pod \"cert-manager-5b446d88c5-26dms\" (UID: \"435110f0-cfd8-423b-afdc-a9cfe0426a3a\") " pod="cert-manager/cert-manager-5b446d88c5-26dms" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.285396 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqsl8\" (UniqueName: \"kubernetes.io/projected/f24da60a-9cbf-482f-a764-43145245be50-kube-api-access-vqsl8\") pod \"cert-manager-cainjector-7f985d654d-d6clp\" (UID: \"f24da60a-9cbf-482f-a764-43145245be50\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-d6clp" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.285434 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkgj9\" (UniqueName: \"kubernetes.io/projected/097254a7-5e3a-4d5b-90d5-f6f1ae0b4c3d-kube-api-access-hkgj9\") pod \"cert-manager-webhook-5655c58dd6-dhkqx\" (UID: \"097254a7-5e3a-4d5b-90d5-f6f1ae0b4c3d\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-dhkqx" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.303366 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkgj9\" (UniqueName: \"kubernetes.io/projected/097254a7-5e3a-4d5b-90d5-f6f1ae0b4c3d-kube-api-access-hkgj9\") pod \"cert-manager-webhook-5655c58dd6-dhkqx\" (UID: \"097254a7-5e3a-4d5b-90d5-f6f1ae0b4c3d\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-dhkqx" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.306032 4851 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vqsl8\" (UniqueName: \"kubernetes.io/projected/f24da60a-9cbf-482f-a764-43145245be50-kube-api-access-vqsl8\") pod \"cert-manager-cainjector-7f985d654d-d6clp\" (UID: \"f24da60a-9cbf-482f-a764-43145245be50\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-d6clp" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.308683 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m4c9\" (UniqueName: \"kubernetes.io/projected/435110f0-cfd8-423b-afdc-a9cfe0426a3a-kube-api-access-9m4c9\") pod \"cert-manager-5b446d88c5-26dms\" (UID: \"435110f0-cfd8-423b-afdc-a9cfe0426a3a\") " pod="cert-manager/cert-manager-5b446d88c5-26dms" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.455989 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-d6clp" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.474067 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-26dms" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.501200 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-dhkqx" Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.772467 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-26dms"] Oct 01 13:02:56 crc kubenswrapper[4851]: I1001 13:02:56.782656 4851 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:02:57 crc kubenswrapper[4851]: I1001 13:02:57.037856 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-dhkqx"] Oct 01 13:02:57 crc kubenswrapper[4851]: I1001 13:02:57.050394 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-d6clp"] Oct 01 13:02:57 crc kubenswrapper[4851]: W1001 13:02:57.052628 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod097254a7_5e3a_4d5b_90d5_f6f1ae0b4c3d.slice/crio-377165c009f72d4df3f722d7121a39593dfada005228afff002ab45be70478d2 WatchSource:0}: Error finding container 377165c009f72d4df3f722d7121a39593dfada005228afff002ab45be70478d2: Status 404 returned error can't find the container with id 377165c009f72d4df3f722d7121a39593dfada005228afff002ab45be70478d2 Oct 01 13:02:57 crc kubenswrapper[4851]: I1001 13:02:57.731923 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-26dms" event={"ID":"435110f0-cfd8-423b-afdc-a9cfe0426a3a","Type":"ContainerStarted","Data":"8d85a7a15e9e36c67f9ef13be52d6200a0da6c5c8760372539f4cfec8cd5f64a"} Oct 01 13:02:57 crc kubenswrapper[4851]: I1001 13:02:57.734592 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-d6clp" event={"ID":"f24da60a-9cbf-482f-a764-43145245be50","Type":"ContainerStarted","Data":"9e3660f0246d7faef8865397d45a9cd6356d037bda39185cd3b8ca9e9206421f"} Oct 01 13:02:57 crc kubenswrapper[4851]: I1001 13:02:57.736185 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-dhkqx" event={"ID":"097254a7-5e3a-4d5b-90d5-f6f1ae0b4c3d","Type":"ContainerStarted","Data":"377165c009f72d4df3f722d7121a39593dfada005228afff002ab45be70478d2"} Oct 01 13:02:59 crc kubenswrapper[4851]: I1001 
13:02:59.746680 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-d6clp" event={"ID":"f24da60a-9cbf-482f-a764-43145245be50","Type":"ContainerStarted","Data":"ad6348015843e3fbd85d4fc73f07786902ce4974370690bddf720e650be9b35b"} Oct 01 13:02:59 crc kubenswrapper[4851]: I1001 13:02:59.748204 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-26dms" event={"ID":"435110f0-cfd8-423b-afdc-a9cfe0426a3a","Type":"ContainerStarted","Data":"95252174fe806216df608ce58a72b9ea77b7e5429345fcd71ee9bd30599a5909"} Oct 01 13:02:59 crc kubenswrapper[4851]: I1001 13:02:59.768454 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-d6clp" podStartSLOduration=1.5021855720000001 podStartE2EDuration="3.768438077s" podCreationTimestamp="2025-10-01 13:02:56 +0000 UTC" firstStartedPulling="2025-10-01 13:02:57.065697564 +0000 UTC m=+585.410815080" lastFinishedPulling="2025-10-01 13:02:59.331950099 +0000 UTC m=+587.677067585" observedRunningTime="2025-10-01 13:02:59.765378668 +0000 UTC m=+588.110496154" watchObservedRunningTime="2025-10-01 13:02:59.768438077 +0000 UTC m=+588.113555563" Oct 01 13:03:00 crc kubenswrapper[4851]: I1001 13:03:00.050082 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:03:00 crc kubenswrapper[4851]: I1001 13:03:00.050149 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:03:00 crc kubenswrapper[4851]: I1001 13:03:00.050200 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 13:03:00 crc kubenswrapper[4851]: I1001 13:03:00.051203 4851 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"df5480c0ce0978a756a190c53fc2c8d0701f9def62fec11373bfd7467e9dc90a"} pod="openshift-machine-config-operator/machine-config-daemon-fv72m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:03:00 crc kubenswrapper[4851]: I1001 13:03:00.051543 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" containerID="cri-o://df5480c0ce0978a756a190c53fc2c8d0701f9def62fec11373bfd7467e9dc90a" gracePeriod=600 Oct 01 13:03:00 crc kubenswrapper[4851]: I1001 13:03:00.756888 4851 generic.go:334] "Generic (PLEG): container finished" podID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerID="df5480c0ce0978a756a190c53fc2c8d0701f9def62fec11373bfd7467e9dc90a" exitCode=0 Oct 01 13:03:00 crc kubenswrapper[4851]: I1001 13:03:00.756987 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" 
event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerDied","Data":"df5480c0ce0978a756a190c53fc2c8d0701f9def62fec11373bfd7467e9dc90a"} Oct 01 13:03:00 crc kubenswrapper[4851]: I1001 13:03:00.757127 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"288531af0e115f9595ac6f6f759c2572ba4e5c19461b4094fb567dd41bccf2dd"} Oct 01 13:03:00 crc kubenswrapper[4851]: I1001 13:03:00.757151 4851 scope.go:117] "RemoveContainer" containerID="c0cb68eadfaa5fa243dbc130588998253b08c77917d93919066ac304838046a9" Oct 01 13:03:00 crc kubenswrapper[4851]: I1001 13:03:00.759419 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-dhkqx" event={"ID":"097254a7-5e3a-4d5b-90d5-f6f1ae0b4c3d","Type":"ContainerStarted","Data":"a4200f9b9ba86dd5a887bbee840d85912a1f0e84a6bb505236a203b2487c317e"} Oct 01 13:03:00 crc kubenswrapper[4851]: I1001 13:03:00.759738 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-dhkqx" Oct 01 13:03:00 crc kubenswrapper[4851]: I1001 13:03:00.781130 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-26dms" podStartSLOduration=2.271936936 podStartE2EDuration="4.781106909s" podCreationTimestamp="2025-10-01 13:02:56 +0000 UTC" firstStartedPulling="2025-10-01 13:02:56.782448617 +0000 UTC m=+585.127566103" lastFinishedPulling="2025-10-01 13:02:59.29161859 +0000 UTC m=+587.636736076" observedRunningTime="2025-10-01 13:02:59.791915167 +0000 UTC m=+588.137032653" watchObservedRunningTime="2025-10-01 13:03:00.781106909 +0000 UTC m=+589.126224405" Oct 01 13:03:00 crc kubenswrapper[4851]: I1001 13:03:00.793081 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-dhkqx" podStartSLOduration=1.626964788 podStartE2EDuration="4.793064706s" podCreationTimestamp="2025-10-01 13:02:56 +0000 UTC" firstStartedPulling="2025-10-01 13:02:57.068750733 +0000 UTC m=+585.413868229" lastFinishedPulling="2025-10-01 13:03:00.234850661 +0000 UTC m=+588.579968147" observedRunningTime="2025-10-01 13:03:00.79149155 +0000 UTC m=+589.136609066" watchObservedRunningTime="2025-10-01 13:03:00.793064706 +0000 UTC m=+589.138182202" Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.506382 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-dhkqx" Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.622549 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-s78wn"] Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.629496 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovn-controller" containerID="cri-o://96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b" gracePeriod=30 Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.629579 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="nbdb" containerID="cri-o://f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9" gracePeriod=30 Oct 01 13:03:06 crc 
kubenswrapper[4851]: I1001 13:03:06.629633 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="northd" containerID="cri-o://d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5" gracePeriod=30 Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.629648 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f" gracePeriod=30 Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.629703 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="kube-rbac-proxy-node" containerID="cri-o://6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124" gracePeriod=30 Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.629777 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovn-acl-logging" containerID="cri-o://e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a" gracePeriod=30 Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.629882 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="sbdb" containerID="cri-o://e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69" gracePeriod=30 Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.680863 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovnkube-controller" containerID="cri-o://726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5" gracePeriod=30 Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.809435 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s78wn_eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9/ovnkube-controller/3.log" Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.813309 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s78wn_eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9/ovn-acl-logging/0.log" Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.813990 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s78wn_eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9/ovn-controller/0.log" Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.816706 4851 generic.go:334] "Generic (PLEG): container finished" podID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerID="726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5" exitCode=0 Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.816743 4851 generic.go:334] "Generic (PLEG): container finished" podID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerID="0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f" exitCode=0 Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.816757 4851 generic.go:334] "Generic (PLEG): container finished" 
podID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerID="6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124" exitCode=0 Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.816769 4851 generic.go:334] "Generic (PLEG): container finished" podID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerID="e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a" exitCode=143 Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.816779 4851 generic.go:334] "Generic (PLEG): container finished" podID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerID="96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b" exitCode=143 Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.816746 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerDied","Data":"726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5"} Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.816869 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerDied","Data":"0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f"} Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.816883 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerDied","Data":"6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124"} Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.816898 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerDied","Data":"e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a"} Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.816912 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerDied","Data":"96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b"} Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.816933 4851 scope.go:117] "RemoveContainer" containerID="a1b6af2f6ac4497d8c14ffd58bab6752f79d0bda96bd501432860664e60954d4" Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.819968 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t5vvf_f68f162a-4e04-41d2-8197-95bac24aad23/kube-multus/2.log" Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.820516 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t5vvf_f68f162a-4e04-41d2-8197-95bac24aad23/kube-multus/1.log" Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.820571 4851 generic.go:334] "Generic (PLEG): container finished" podID="f68f162a-4e04-41d2-8197-95bac24aad23" containerID="67fb38d621be1a818ac5d4d426ab4652a41b195832d2e760e8388c776796f58c" exitCode=2 Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.820616 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t5vvf" event={"ID":"f68f162a-4e04-41d2-8197-95bac24aad23","Type":"ContainerDied","Data":"67fb38d621be1a818ac5d4d426ab4652a41b195832d2e760e8388c776796f58c"} Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.821464 4851 scope.go:117] "RemoveContainer" 
containerID="67fb38d621be1a818ac5d4d426ab4652a41b195832d2e760e8388c776796f58c" Oct 01 13:03:06 crc kubenswrapper[4851]: E1001 13:03:06.821788 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-t5vvf_openshift-multus(f68f162a-4e04-41d2-8197-95bac24aad23)\"" pod="openshift-multus/multus-t5vvf" podUID="f68f162a-4e04-41d2-8197-95bac24aad23" Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.865180 4851 scope.go:117] "RemoveContainer" containerID="9be4185c6a9a5112475d68907478d108366f2f6bff109e7d802e8b720953ffd9" Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.976904 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s78wn_eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9/ovn-acl-logging/0.log" Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.977639 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s78wn_eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9/ovn-controller/0.log" Oct 01 13:03:06 crc kubenswrapper[4851]: I1001 13:03:06.978099 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030074 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ktkj4"] Oct 01 13:03:07 crc kubenswrapper[4851]: E1001 13:03:07.030257 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="kube-rbac-proxy-node" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030268 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="kube-rbac-proxy-node" Oct 01 13:03:07 crc kubenswrapper[4851]: E1001 13:03:07.030281 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="sbdb" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030287 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="sbdb" Oct 01 13:03:07 crc kubenswrapper[4851]: E1001 13:03:07.030295 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="nbdb" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030301 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="nbdb" Oct 01 13:03:07 crc kubenswrapper[4851]: E1001 13:03:07.030309 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="kubecfg-setup" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030315 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="kubecfg-setup" Oct 01 13:03:07 crc kubenswrapper[4851]: E1001 13:03:07.030326 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovn-acl-logging" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030332 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovn-acl-logging" Oct 01 13:03:07 crc kubenswrapper[4851]: E1001 13:03:07.030339 4851 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovnkube-controller" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030344 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovnkube-controller" Oct 01 13:03:07 crc kubenswrapper[4851]: E1001 13:03:07.030352 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovnkube-controller" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030357 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovnkube-controller" Oct 01 13:03:07 crc kubenswrapper[4851]: E1001 13:03:07.030364 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovnkube-controller" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030370 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovnkube-controller" Oct 01 13:03:07 crc kubenswrapper[4851]: E1001 13:03:07.030380 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovnkube-controller" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030388 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovnkube-controller" Oct 01 13:03:07 crc kubenswrapper[4851]: E1001 13:03:07.030396 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovn-controller" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030402 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovn-controller" Oct 01 13:03:07 crc kubenswrapper[4851]: E1001 13:03:07.030411 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovnkube-controller" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030416 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovnkube-controller" Oct 01 13:03:07 crc kubenswrapper[4851]: E1001 13:03:07.030425 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030431 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 13:03:07 crc kubenswrapper[4851]: E1001 13:03:07.030438 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="northd" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030458 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="northd" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030572 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovn-acl-logging" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030582 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="sbdb" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030590 
4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovnkube-controller" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030596 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovnkube-controller" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030603 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030614 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovnkube-controller" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030621 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovn-controller" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030629 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="kube-rbac-proxy-node" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030637 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="nbdb" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030644 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="northd" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030795 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovnkube-controller" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.030805 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerName="ovnkube-controller" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.032201 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.041265 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-slash\") pod \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.041416 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.041357 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-slash" (OuterVolumeSpecName: "host-slash") pod "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" (UID: "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.041469 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" (UID: "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.041571 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-run-openvswitch\") pod \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.041595 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-cni-netd\") pod \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.041639 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" (UID: "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.041677 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-run-netns\") pod \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.041724 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" (UID: "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.041740 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" (UID: "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.041701 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-systemd-units\") pod \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.041827 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-ovn-node-metrics-cert\") pod \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.041844 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" (UID: "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.041851 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-run-systemd\") pod \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.041925 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-env-overrides\") pod \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.041943 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-node-log\") pod \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.041959 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-ovnkube-config\") pod \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.041980 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-run-ovn-kubernetes\") pod \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042008 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9jss\" (UniqueName: \"kubernetes.io/projected/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-kube-api-access-l9jss\") pod \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042047 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-run-ovn\") pod \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042069 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-ovnkube-script-lib\") pod \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042085 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-log-socket\") pod \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042100 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-kubelet\") pod \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042129 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-var-lib-openvswitch\") pod \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042148 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-etc-openvswitch\") pod \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042163 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-cni-bin\") pod \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\" (UID: \"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9\") " Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042458 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-host-run-ovn-kubernetes\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042494 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-var-lib-openvswitch\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042536 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-ovn-node-metrics-cert\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 
13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042566 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-etc-openvswitch\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042584 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-ovnkube-script-lib\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042602 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042623 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-ovnkube-config\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042651 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lmvf\" (UniqueName: \"kubernetes.io/projected/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-kube-api-access-2lmvf\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042668 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-host-run-netns\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042672 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" (UID: "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042670 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" (UID: "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042703 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" (UID: "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042726 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-log-socket" (OuterVolumeSpecName: "log-socket") pod "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" (UID: "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042693 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-host-cni-netd\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042749 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" (UID: "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042767 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" (UID: "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042785 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" (UID: "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042802 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" (UID: "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042815 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-host-kubelet\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042855 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-node-log\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042878 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-run-ovn\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042884 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-node-log" (OuterVolumeSpecName: "node-log") pod "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" (UID: "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042906 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-host-slash\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042927 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" (UID: "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.042989 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" (UID: "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.043041 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-env-overrides\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.043107 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-systemd-units\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.043137 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-host-cni-bin\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.043175 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-run-openvswitch\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.043246 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-log-socket\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.043336 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-run-systemd\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.043427 4851 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.043452 4851 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.043471 4851 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-log-socket\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.043490 4851 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 
13:03:07.043539 4851 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.043557 4851 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.043575 4851 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.043599 4851 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-slash\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.043626 4851 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.043654 4851 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.043677 4851 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.043699 4851 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.043720 4851 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.043742 4851 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-node-log\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.043763 4851 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.043787 4851 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.043809 4851 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.048543 
4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-kube-api-access-l9jss" (OuterVolumeSpecName: "kube-api-access-l9jss") pod "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" (UID: "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9"). InnerVolumeSpecName "kube-api-access-l9jss". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.049039 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" (UID: "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.056309 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" (UID: "eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145007 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-systemd-units\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145055 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-host-cni-bin\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145080 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-run-openvswitch\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145105 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-log-socket\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145115 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-systemd-units\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145134 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-host-cni-bin\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145165 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-run-systemd\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145126 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-run-systemd\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145220 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-log-socket\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145184 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-run-openvswitch\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145307 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-host-run-ovn-kubernetes\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145351 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-host-run-ovn-kubernetes\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145358 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-var-lib-openvswitch\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145383 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-var-lib-openvswitch\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145417 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-ovn-node-metrics-cert\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc 
kubenswrapper[4851]: I1001 13:03:07.145451 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-etc-openvswitch\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145559 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-ovnkube-script-lib\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145634 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145650 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-etc-openvswitch\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145672 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-ovnkube-config\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145718 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lmvf\" (UniqueName: \"kubernetes.io/projected/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-kube-api-access-2lmvf\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145720 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145739 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-host-run-netns\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145784 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-host-cni-netd\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 
13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145803 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-host-kubelet\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145820 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-node-log\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145859 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-run-ovn\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145878 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-host-slash\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145899 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-env-overrides\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145914 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-host-cni-netd\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145934 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-node-log\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145931 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-host-run-netns\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.145995 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-host-kubelet\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.146021 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-host-slash\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.146030 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-run-ovn\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.146320 4851 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.146362 4851 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.146436 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9jss\" (UniqueName: \"kubernetes.io/projected/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9-kube-api-access-l9jss\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.146878 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-env-overrides\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.146964 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-ovnkube-config\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.147016 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-ovnkube-script-lib\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.148671 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-ovn-node-metrics-cert\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.175965 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lmvf\" (UniqueName: \"kubernetes.io/projected/6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0-kube-api-access-2lmvf\") pod \"ovnkube-node-ktkj4\" (UID: \"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.354978 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.834995 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s78wn_eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9/ovn-acl-logging/0.log" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.835760 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s78wn_eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9/ovn-controller/0.log" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.836434 4851 generic.go:334] "Generic (PLEG): container finished" podID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerID="e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69" exitCode=0 Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.836466 4851 generic.go:334] "Generic (PLEG): container finished" podID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerID="f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9" exitCode=0 Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.836475 4851 generic.go:334] "Generic (PLEG): container finished" podID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" containerID="d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5" exitCode=0 Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.836573 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerDied","Data":"e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69"} Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.836680 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.836704 4851 scope.go:117] "RemoveContainer" containerID="726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.836683 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerDied","Data":"f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9"} Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.836877 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerDied","Data":"d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5"} Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.836929 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s78wn" event={"ID":"eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9","Type":"ContainerDied","Data":"36ab8cd424e79f64cdab995ff2d93a564be1178f4ef7cf4a4bf7d52cac2b75da"} Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.842204 4851 generic.go:334] "Generic (PLEG): container finished" podID="6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0" containerID="c7c6e66fb885f7c5198dbea4803f7eaafdf1d4263dc60d68ac12136246d67b26" exitCode=0 Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.842305 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" event={"ID":"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0","Type":"ContainerDied","Data":"c7c6e66fb885f7c5198dbea4803f7eaafdf1d4263dc60d68ac12136246d67b26"} Oct 01 13:03:07 crc 
kubenswrapper[4851]: I1001 13:03:07.842344 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" event={"ID":"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0","Type":"ContainerStarted","Data":"ce459c5facc215bbeaad4cef7389ffb0a95993e7f1b3ee9da25a13c5357f9568"} Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.845117 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t5vvf_f68f162a-4e04-41d2-8197-95bac24aad23/kube-multus/2.log" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.875494 4851 scope.go:117] "RemoveContainer" containerID="e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.915535 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-s78wn"] Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.919647 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-s78wn"] Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.932024 4851 scope.go:117] "RemoveContainer" containerID="f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.960321 4851 scope.go:117] "RemoveContainer" containerID="d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5" Oct 01 13:03:07 crc kubenswrapper[4851]: I1001 13:03:07.983065 4851 scope.go:117] "RemoveContainer" containerID="0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.004373 4851 scope.go:117] "RemoveContainer" containerID="6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.035906 4851 scope.go:117] "RemoveContainer" containerID="e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.057827 4851 scope.go:117] "RemoveContainer" containerID="96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.080003 4851 scope.go:117] "RemoveContainer" containerID="d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.102569 4851 scope.go:117] "RemoveContainer" containerID="726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5" Oct 01 13:03:08 crc kubenswrapper[4851]: E1001 13:03:08.103014 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5\": container with ID starting with 726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5 not found: ID does not exist" containerID="726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.103129 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5"} err="failed to get container status \"726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5\": rpc error: code = NotFound desc = could not find container \"726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5\": container with ID starting with 726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5 not found: ID does not exist" Oct 01 13:03:08 crc 
kubenswrapper[4851]: I1001 13:03:08.103161 4851 scope.go:117] "RemoveContainer" containerID="e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69" Oct 01 13:03:08 crc kubenswrapper[4851]: E1001 13:03:08.103469 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\": container with ID starting with e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69 not found: ID does not exist" containerID="e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.103522 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69"} err="failed to get container status \"e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\": rpc error: code = NotFound desc = could not find container \"e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\": container with ID starting with e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69 not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.103554 4851 scope.go:117] "RemoveContainer" containerID="f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9" Oct 01 13:03:08 crc kubenswrapper[4851]: E1001 13:03:08.103825 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\": container with ID starting with f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9 not found: ID does not exist" containerID="f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.103865 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9"} err="failed to get container status \"f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\": rpc error: code = NotFound desc = could not find container \"f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\": container with ID starting with f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9 not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.103891 4851 scope.go:117] "RemoveContainer" containerID="d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5" Oct 01 13:03:08 crc kubenswrapper[4851]: E1001 13:03:08.104102 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\": container with ID starting with d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5 not found: ID does not exist" containerID="d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.104129 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5"} err="failed to get container status \"d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\": rpc error: code = NotFound desc = could not find container 
\"d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\": container with ID starting with d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5 not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.104145 4851 scope.go:117] "RemoveContainer" containerID="0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f" Oct 01 13:03:08 crc kubenswrapper[4851]: E1001 13:03:08.104392 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\": container with ID starting with 0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f not found: ID does not exist" containerID="0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.104419 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f"} err="failed to get container status \"0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\": rpc error: code = NotFound desc = could not find container \"0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\": container with ID starting with 0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.104436 4851 scope.go:117] "RemoveContainer" containerID="6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124" Oct 01 13:03:08 crc kubenswrapper[4851]: E1001 13:03:08.104751 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\": container with ID starting with 6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124 not found: ID does not exist" containerID="6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.104771 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124"} err="failed to get container status \"6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\": rpc error: code = NotFound desc = could not find container \"6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\": container with ID starting with 6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124 not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.104785 4851 scope.go:117] "RemoveContainer" containerID="e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a" Oct 01 13:03:08 crc kubenswrapper[4851]: E1001 13:03:08.105123 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\": container with ID starting with e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a not found: ID does not exist" containerID="e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.105158 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a"} 
err="failed to get container status \"e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\": rpc error: code = NotFound desc = could not find container \"e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\": container with ID starting with e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.105172 4851 scope.go:117] "RemoveContainer" containerID="96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b" Oct 01 13:03:08 crc kubenswrapper[4851]: E1001 13:03:08.105435 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\": container with ID starting with 96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b not found: ID does not exist" containerID="96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.105480 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b"} err="failed to get container status \"96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\": rpc error: code = NotFound desc = could not find container \"96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\": container with ID starting with 96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.105533 4851 scope.go:117] "RemoveContainer" containerID="d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f" Oct 01 13:03:08 crc kubenswrapper[4851]: E1001 13:03:08.105785 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\": container with ID starting with d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f not found: ID does not exist" containerID="d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.105806 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f"} err="failed to get container status \"d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\": rpc error: code = NotFound desc = could not find container \"d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\": container with ID starting with d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.105820 4851 scope.go:117] "RemoveContainer" containerID="726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.105971 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5"} err="failed to get container status \"726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5\": rpc error: code = NotFound desc = could not find container \"726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5\": container with ID starting with 
726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5 not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.105990 4851 scope.go:117] "RemoveContainer" containerID="e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.106178 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69"} err="failed to get container status \"e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\": rpc error: code = NotFound desc = could not find container \"e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\": container with ID starting with e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69 not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.106196 4851 scope.go:117] "RemoveContainer" containerID="f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.106747 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9"} err="failed to get container status \"f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\": rpc error: code = NotFound desc = could not find container \"f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\": container with ID starting with f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9 not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.106817 4851 scope.go:117] "RemoveContainer" containerID="d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.107289 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5"} err="failed to get container status \"d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\": rpc error: code = NotFound desc = could not find container \"d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\": container with ID starting with d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5 not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.107311 4851 scope.go:117] "RemoveContainer" containerID="0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.107800 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f"} err="failed to get container status \"0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\": rpc error: code = NotFound desc = could not find container \"0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\": container with ID starting with 0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.107823 4851 scope.go:117] "RemoveContainer" containerID="6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.108067 4851 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124"} err="failed to get container status \"6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\": rpc error: code = NotFound desc = could not find container \"6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\": container with ID starting with 6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124 not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.108101 4851 scope.go:117] "RemoveContainer" containerID="e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.108427 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a"} err="failed to get container status \"e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\": rpc error: code = NotFound desc = could not find container \"e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\": container with ID starting with e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.108450 4851 scope.go:117] "RemoveContainer" containerID="96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.108756 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b"} err="failed to get container status \"96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\": rpc error: code = NotFound desc = could not find container \"96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\": container with ID starting with 96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.108787 4851 scope.go:117] "RemoveContainer" containerID="d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.109078 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f"} err="failed to get container status \"d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\": rpc error: code = NotFound desc = could not find container \"d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\": container with ID starting with d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.109106 4851 scope.go:117] "RemoveContainer" containerID="726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.109358 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5"} err="failed to get container status \"726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5\": rpc error: code = NotFound desc = could not find container \"726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5\": container with ID starting with 726fb0384f8f37b8eab074f69a293d36119a932c3d959c15f107c3c100f670b5 not found: ID does not exist" Oct 
01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.109384 4851 scope.go:117] "RemoveContainer" containerID="e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.109648 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69"} err="failed to get container status \"e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\": rpc error: code = NotFound desc = could not find container \"e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69\": container with ID starting with e5495096926e0b5c4ae7ae0d6b38435288f97ef37570eb6c2670cbdf24117e69 not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.109691 4851 scope.go:117] "RemoveContainer" containerID="f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.109968 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9"} err="failed to get container status \"f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\": rpc error: code = NotFound desc = could not find container \"f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9\": container with ID starting with f9a560d10123c4f520d88b6e17865cb81f07972e6f034904e209d49f85e82ff9 not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.110006 4851 scope.go:117] "RemoveContainer" containerID="d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.110293 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5"} err="failed to get container status \"d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\": rpc error: code = NotFound desc = could not find container \"d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5\": container with ID starting with d574d0a1f26928616b67b41d8ac74460b8a395d78b2e365f398cdbd1abed88f5 not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.110320 4851 scope.go:117] "RemoveContainer" containerID="0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.110586 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f"} err="failed to get container status \"0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\": rpc error: code = NotFound desc = could not find container \"0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f\": container with ID starting with 0ae08b19eb92c9d07ef12c660ad1a89ce547cacd38b04fe500c9a3aeebd8d04f not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.110620 4851 scope.go:117] "RemoveContainer" containerID="6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.110922 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124"} err="failed to get container status 
\"6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\": rpc error: code = NotFound desc = could not find container \"6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124\": container with ID starting with 6371b16e4bd33f1ccb0d222f10c000ecefd1dd260fec0cfd124d2e1377e0d124 not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.110959 4851 scope.go:117] "RemoveContainer" containerID="e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.111488 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a"} err="failed to get container status \"e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\": rpc error: code = NotFound desc = could not find container \"e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a\": container with ID starting with e19491ddadb04af4cfb01b59ac12be4134f9967aaba276e246a4055686f4a01a not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.111541 4851 scope.go:117] "RemoveContainer" containerID="96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.111807 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b"} err="failed to get container status \"96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\": rpc error: code = NotFound desc = could not find container \"96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b\": container with ID starting with 96b3d1c7c9ed70c042fb6c6fef4066f9f8b36c43578a1465bd24e305221efb9b not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.111847 4851 scope.go:117] "RemoveContainer" containerID="d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.112092 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f"} err="failed to get container status \"d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\": rpc error: code = NotFound desc = could not find container \"d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f\": container with ID starting with d11a357c934ca5d74cebd87f10459962fe2200b8b48e4c1c0a029f5af33fcf7f not found: ID does not exist" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.339158 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9" path="/var/lib/kubelet/pods/eff1f44a-a0e9-4b4f-9d74-ab1c2bf4b1c9/volumes" Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.854744 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" event={"ID":"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0","Type":"ContainerStarted","Data":"558fe1726123044941b2118d83bbdb26b95a3a4a88a2bfa973c215f9ccdf4f05"} Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.855494 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" event={"ID":"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0","Type":"ContainerStarted","Data":"f1dfa5c93e714ad71d8cb0ce60e6209774fa1c57eb0bf5c6ea1a6a4ab3eb2e79"} Oct 01 13:03:08 crc 
kubenswrapper[4851]: I1001 13:03:08.855533 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" event={"ID":"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0","Type":"ContainerStarted","Data":"03b56800f77b094884d9a6821420defe581186ef8575ff68c12b75ea03bf613b"} Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.855550 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" event={"ID":"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0","Type":"ContainerStarted","Data":"851a2fd1dfe45451f67e233a884f488a59e6cb4ddf86757cb3e1849163631336"} Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.855565 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" event={"ID":"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0","Type":"ContainerStarted","Data":"c8270951d6239f20539fe320ee2be2c0a9a50ed97797c2f49b6e80a06852de86"} Oct 01 13:03:08 crc kubenswrapper[4851]: I1001 13:03:08.855579 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" event={"ID":"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0","Type":"ContainerStarted","Data":"fa9ade4b7917f4bd5fa51858a45cb806ce9f8453a0d97d2440ebb89b32308dd4"} Oct 01 13:03:11 crc kubenswrapper[4851]: I1001 13:03:11.884171 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" event={"ID":"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0","Type":"ContainerStarted","Data":"599667c5ff59f77c07e05959a31ce24c148b1f450deb269b34d24282b8c10436"} Oct 01 13:03:13 crc kubenswrapper[4851]: I1001 13:03:13.900056 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" event={"ID":"6ca4cb80-dfa5-410c-9bd9-87d2237c7cc0","Type":"ContainerStarted","Data":"ae4e5c04867c4b8964c667a8cca857dfc7738322816e6707fd2b3c7ef7c87cae"} Oct 01 13:03:13 crc kubenswrapper[4851]: I1001 13:03:13.900547 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:13 crc kubenswrapper[4851]: I1001 13:03:13.900559 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:13 crc kubenswrapper[4851]: I1001 13:03:13.928147 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" podStartSLOduration=6.928127788 podStartE2EDuration="6.928127788s" podCreationTimestamp="2025-10-01 13:03:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:03:13.926910633 +0000 UTC m=+602.272028129" watchObservedRunningTime="2025-10-01 13:03:13.928127788 +0000 UTC m=+602.273245274" Oct 01 13:03:13 crc kubenswrapper[4851]: I1001 13:03:13.933289 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:13 crc kubenswrapper[4851]: I1001 13:03:13.934907 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:14 crc kubenswrapper[4851]: I1001 13:03:14.905969 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:22 crc kubenswrapper[4851]: I1001 13:03:22.331718 4851 scope.go:117] "RemoveContainer" 
containerID="67fb38d621be1a818ac5d4d426ab4652a41b195832d2e760e8388c776796f58c" Oct 01 13:03:22 crc kubenswrapper[4851]: E1001 13:03:22.332600 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-t5vvf_openshift-multus(f68f162a-4e04-41d2-8197-95bac24aad23)\"" pod="openshift-multus/multus-t5vvf" podUID="f68f162a-4e04-41d2-8197-95bac24aad23" Oct 01 13:03:34 crc kubenswrapper[4851]: I1001 13:03:34.762075 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9"] Oct 01 13:03:34 crc kubenswrapper[4851]: I1001 13:03:34.764605 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" Oct 01 13:03:34 crc kubenswrapper[4851]: I1001 13:03:34.770602 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 01 13:03:34 crc kubenswrapper[4851]: I1001 13:03:34.778893 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9"] Oct 01 13:03:34 crc kubenswrapper[4851]: I1001 13:03:34.834718 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f9ph\" (UniqueName: \"kubernetes.io/projected/d3aab3b2-075f-4afa-969e-3e32803601b9-kube-api-access-2f9ph\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9\" (UID: \"d3aab3b2-075f-4afa-969e-3e32803601b9\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" Oct 01 13:03:34 crc kubenswrapper[4851]: I1001 13:03:34.834825 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3aab3b2-075f-4afa-969e-3e32803601b9-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9\" (UID: \"d3aab3b2-075f-4afa-969e-3e32803601b9\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" Oct 01 13:03:34 crc kubenswrapper[4851]: I1001 13:03:34.834886 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3aab3b2-075f-4afa-969e-3e32803601b9-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9\" (UID: \"d3aab3b2-075f-4afa-969e-3e32803601b9\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" Oct 01 13:03:34 crc kubenswrapper[4851]: I1001 13:03:34.935739 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3aab3b2-075f-4afa-969e-3e32803601b9-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9\" (UID: \"d3aab3b2-075f-4afa-969e-3e32803601b9\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" Oct 01 13:03:34 crc kubenswrapper[4851]: I1001 13:03:34.935816 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3aab3b2-075f-4afa-969e-3e32803601b9-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9\" (UID: 
\"d3aab3b2-075f-4afa-969e-3e32803601b9\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" Oct 01 13:03:34 crc kubenswrapper[4851]: I1001 13:03:34.935951 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f9ph\" (UniqueName: \"kubernetes.io/projected/d3aab3b2-075f-4afa-969e-3e32803601b9-kube-api-access-2f9ph\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9\" (UID: \"d3aab3b2-075f-4afa-969e-3e32803601b9\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" Oct 01 13:03:34 crc kubenswrapper[4851]: I1001 13:03:34.936914 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3aab3b2-075f-4afa-969e-3e32803601b9-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9\" (UID: \"d3aab3b2-075f-4afa-969e-3e32803601b9\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" Oct 01 13:03:34 crc kubenswrapper[4851]: I1001 13:03:34.937120 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3aab3b2-075f-4afa-969e-3e32803601b9-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9\" (UID: \"d3aab3b2-075f-4afa-969e-3e32803601b9\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" Oct 01 13:03:34 crc kubenswrapper[4851]: I1001 13:03:34.962847 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f9ph\" (UniqueName: \"kubernetes.io/projected/d3aab3b2-075f-4afa-969e-3e32803601b9-kube-api-access-2f9ph\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9\" (UID: \"d3aab3b2-075f-4afa-969e-3e32803601b9\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" Oct 01 13:03:35 crc kubenswrapper[4851]: I1001 13:03:35.096450 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" Oct 01 13:03:35 crc kubenswrapper[4851]: E1001 13:03:35.127601 4851 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_openshift-marketplace_d3aab3b2-075f-4afa-969e-3e32803601b9_0(e30aa012b1b7001cad28a982740b2a0e6879bdfb08dff51a6f4ed33ef4368e9a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 13:03:35 crc kubenswrapper[4851]: E1001 13:03:35.127703 4851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_openshift-marketplace_d3aab3b2-075f-4afa-969e-3e32803601b9_0(e30aa012b1b7001cad28a982740b2a0e6879bdfb08dff51a6f4ed33ef4368e9a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" Oct 01 13:03:35 crc kubenswrapper[4851]: E1001 13:03:35.127742 4851 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_openshift-marketplace_d3aab3b2-075f-4afa-969e-3e32803601b9_0(e30aa012b1b7001cad28a982740b2a0e6879bdfb08dff51a6f4ed33ef4368e9a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" Oct 01 13:03:35 crc kubenswrapper[4851]: E1001 13:03:35.127822 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_openshift-marketplace(d3aab3b2-075f-4afa-969e-3e32803601b9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_openshift-marketplace(d3aab3b2-075f-4afa-969e-3e32803601b9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_openshift-marketplace_d3aab3b2-075f-4afa-969e-3e32803601b9_0(e30aa012b1b7001cad28a982740b2a0e6879bdfb08dff51a6f4ed33ef4368e9a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" podUID="d3aab3b2-075f-4afa-969e-3e32803601b9" Oct 01 13:03:36 crc kubenswrapper[4851]: I1001 13:03:36.063591 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" Oct 01 13:03:36 crc kubenswrapper[4851]: I1001 13:03:36.063982 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" Oct 01 13:03:36 crc kubenswrapper[4851]: E1001 13:03:36.106269 4851 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_openshift-marketplace_d3aab3b2-075f-4afa-969e-3e32803601b9_0(b7bab8c21991afcd14d0468a8a8f211be46c2cac821f17e7c8d9177c5d06bc17): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 13:03:36 crc kubenswrapper[4851]: E1001 13:03:36.106349 4851 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_openshift-marketplace_d3aab3b2-075f-4afa-969e-3e32803601b9_0(b7bab8c21991afcd14d0468a8a8f211be46c2cac821f17e7c8d9177c5d06bc17): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" Oct 01 13:03:36 crc kubenswrapper[4851]: E1001 13:03:36.106392 4851 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_openshift-marketplace_d3aab3b2-075f-4afa-969e-3e32803601b9_0(b7bab8c21991afcd14d0468a8a8f211be46c2cac821f17e7c8d9177c5d06bc17): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" Oct 01 13:03:36 crc kubenswrapper[4851]: E1001 13:03:36.106469 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_openshift-marketplace(d3aab3b2-075f-4afa-969e-3e32803601b9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_openshift-marketplace(d3aab3b2-075f-4afa-969e-3e32803601b9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_openshift-marketplace_d3aab3b2-075f-4afa-969e-3e32803601b9_0(b7bab8c21991afcd14d0468a8a8f211be46c2cac821f17e7c8d9177c5d06bc17): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" podUID="d3aab3b2-075f-4afa-969e-3e32803601b9" Oct 01 13:03:37 crc kubenswrapper[4851]: I1001 13:03:37.328548 4851 scope.go:117] "RemoveContainer" containerID="67fb38d621be1a818ac5d4d426ab4652a41b195832d2e760e8388c776796f58c" Oct 01 13:03:37 crc kubenswrapper[4851]: I1001 13:03:37.399195 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ktkj4" Oct 01 13:03:38 crc kubenswrapper[4851]: I1001 13:03:38.074766 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t5vvf_f68f162a-4e04-41d2-8197-95bac24aad23/kube-multus/2.log" Oct 01 13:03:38 crc kubenswrapper[4851]: I1001 13:03:38.074828 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t5vvf" event={"ID":"f68f162a-4e04-41d2-8197-95bac24aad23","Type":"ContainerStarted","Data":"bd15bb4c954b5721e3e390a378b0fa2190f26e4b0ffc34eb07643bc7b2cef540"} Oct 01 13:03:47 crc kubenswrapper[4851]: I1001 13:03:47.328273 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" Oct 01 13:03:47 crc kubenswrapper[4851]: I1001 13:03:47.330086 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" Oct 01 13:03:47 crc kubenswrapper[4851]: I1001 13:03:47.611416 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9"] Oct 01 13:03:48 crc kubenswrapper[4851]: I1001 13:03:48.141832 4851 generic.go:334] "Generic (PLEG): container finished" podID="d3aab3b2-075f-4afa-969e-3e32803601b9" containerID="b845fec5f7a68c3f4939f329a57045c620ac603ede968541de6b29f3d90f385d" exitCode=0 Oct 01 13:03:48 crc kubenswrapper[4851]: I1001 13:03:48.141881 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" event={"ID":"d3aab3b2-075f-4afa-969e-3e32803601b9","Type":"ContainerDied","Data":"b845fec5f7a68c3f4939f329a57045c620ac603ede968541de6b29f3d90f385d"} Oct 01 13:03:48 crc kubenswrapper[4851]: I1001 13:03:48.141909 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" event={"ID":"d3aab3b2-075f-4afa-969e-3e32803601b9","Type":"ContainerStarted","Data":"11966d6f87512d4246c489504cd2a7a8adb7bc230ecef8ac17105cb5014b848a"} Oct 01 13:03:52 crc kubenswrapper[4851]: I1001 13:03:52.174462 4851 generic.go:334] "Generic (PLEG): container finished" podID="d3aab3b2-075f-4afa-969e-3e32803601b9" containerID="db502d08321dc96c66e7d65123e9eee52f3576633e0f86c95e462b98f3b4d33d" exitCode=0 Oct 01 13:03:52 crc kubenswrapper[4851]: I1001 13:03:52.174567 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" event={"ID":"d3aab3b2-075f-4afa-969e-3e32803601b9","Type":"ContainerDied","Data":"db502d08321dc96c66e7d65123e9eee52f3576633e0f86c95e462b98f3b4d33d"} Oct 01 13:03:53 crc kubenswrapper[4851]: I1001 13:03:53.184410 4851 generic.go:334] "Generic (PLEG): container finished" podID="d3aab3b2-075f-4afa-969e-3e32803601b9" containerID="d61cde211b222d1bc040b464ac920c97b5a2b983afcfdbadeafd4d63045408c0" exitCode=0 Oct 01 13:03:53 crc kubenswrapper[4851]: I1001 13:03:53.184459 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" event={"ID":"d3aab3b2-075f-4afa-969e-3e32803601b9","Type":"ContainerDied","Data":"d61cde211b222d1bc040b464ac920c97b5a2b983afcfdbadeafd4d63045408c0"} Oct 01 13:03:54 crc kubenswrapper[4851]: I1001 13:03:54.484788 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" Oct 01 13:03:54 crc kubenswrapper[4851]: I1001 13:03:54.537941 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3aab3b2-075f-4afa-969e-3e32803601b9-bundle\") pod \"d3aab3b2-075f-4afa-969e-3e32803601b9\" (UID: \"d3aab3b2-075f-4afa-969e-3e32803601b9\") " Oct 01 13:03:54 crc kubenswrapper[4851]: I1001 13:03:54.538016 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f9ph\" (UniqueName: \"kubernetes.io/projected/d3aab3b2-075f-4afa-969e-3e32803601b9-kube-api-access-2f9ph\") pod \"d3aab3b2-075f-4afa-969e-3e32803601b9\" (UID: \"d3aab3b2-075f-4afa-969e-3e32803601b9\") " Oct 01 13:03:54 crc kubenswrapper[4851]: I1001 13:03:54.538057 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3aab3b2-075f-4afa-969e-3e32803601b9-util\") pod \"d3aab3b2-075f-4afa-969e-3e32803601b9\" (UID: \"d3aab3b2-075f-4afa-969e-3e32803601b9\") " Oct 01 13:03:54 crc kubenswrapper[4851]: I1001 13:03:54.539856 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3aab3b2-075f-4afa-969e-3e32803601b9-bundle" (OuterVolumeSpecName: "bundle") pod "d3aab3b2-075f-4afa-969e-3e32803601b9" (UID: "d3aab3b2-075f-4afa-969e-3e32803601b9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:03:54 crc kubenswrapper[4851]: I1001 13:03:54.546160 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3aab3b2-075f-4afa-969e-3e32803601b9-kube-api-access-2f9ph" (OuterVolumeSpecName: "kube-api-access-2f9ph") pod "d3aab3b2-075f-4afa-969e-3e32803601b9" (UID: "d3aab3b2-075f-4afa-969e-3e32803601b9"). InnerVolumeSpecName "kube-api-access-2f9ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:03:54 crc kubenswrapper[4851]: I1001 13:03:54.557133 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3aab3b2-075f-4afa-969e-3e32803601b9-util" (OuterVolumeSpecName: "util") pod "d3aab3b2-075f-4afa-969e-3e32803601b9" (UID: "d3aab3b2-075f-4afa-969e-3e32803601b9"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:03:54 crc kubenswrapper[4851]: I1001 13:03:54.639372 4851 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3aab3b2-075f-4afa-969e-3e32803601b9-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:54 crc kubenswrapper[4851]: I1001 13:03:54.639413 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f9ph\" (UniqueName: \"kubernetes.io/projected/d3aab3b2-075f-4afa-969e-3e32803601b9-kube-api-access-2f9ph\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:54 crc kubenswrapper[4851]: I1001 13:03:54.639429 4851 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3aab3b2-075f-4afa-969e-3e32803601b9-util\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:55 crc kubenswrapper[4851]: I1001 13:03:55.200711 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" event={"ID":"d3aab3b2-075f-4afa-969e-3e32803601b9","Type":"ContainerDied","Data":"11966d6f87512d4246c489504cd2a7a8adb7bc230ecef8ac17105cb5014b848a"} Oct 01 13:03:55 crc kubenswrapper[4851]: I1001 13:03:55.200770 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11966d6f87512d4246c489504cd2a7a8adb7bc230ecef8ac17105cb5014b848a" Oct 01 13:03:55 crc kubenswrapper[4851]: I1001 13:03:55.200804 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.180153 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-2tptp"] Oct 01 13:04:07 crc kubenswrapper[4851]: E1001 13:04:07.180787 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3aab3b2-075f-4afa-969e-3e32803601b9" containerName="util" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.180799 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3aab3b2-075f-4afa-969e-3e32803601b9" containerName="util" Oct 01 13:04:07 crc kubenswrapper[4851]: E1001 13:04:07.180809 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3aab3b2-075f-4afa-969e-3e32803601b9" containerName="pull" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.180814 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3aab3b2-075f-4afa-969e-3e32803601b9" containerName="pull" Oct 01 13:04:07 crc kubenswrapper[4851]: E1001 13:04:07.180826 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3aab3b2-075f-4afa-969e-3e32803601b9" containerName="extract" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.180832 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3aab3b2-075f-4afa-969e-3e32803601b9" containerName="extract" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.180919 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3aab3b2-075f-4afa-969e-3e32803601b9" containerName="extract" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.181233 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-2tptp" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.184875 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-6zs8r" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.187933 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.188039 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.202367 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-2tptp"] Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.299560 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-mf765"] Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.300376 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-mf765" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.303034 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-xqxvb" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.303033 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.318951 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-l5l7n"] Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.319639 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-l5l7n" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.321532 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-mf765"] Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.326108 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt7mc\" (UniqueName: \"kubernetes.io/projected/6dc06d35-7a66-4c6a-bd91-1de635f2160f-kube-api-access-jt7mc\") pod \"obo-prometheus-operator-7c8cf85677-2tptp\" (UID: \"6dc06d35-7a66-4c6a-bd91-1de635f2160f\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-2tptp" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.342466 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-l5l7n"] Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.426860 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd26d552-d668-4eee-b51c-468b0f48d5e7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6dd5789d46-l5l7n\" (UID: \"fd26d552-d668-4eee-b51c-468b0f48d5e7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-l5l7n" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.426918 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5797d328-2922-4183-86d5-7d237952df39-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6dd5789d46-mf765\" (UID: \"5797d328-2922-4183-86d5-7d237952df39\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-mf765" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.426952 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd26d552-d668-4eee-b51c-468b0f48d5e7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6dd5789d46-l5l7n\" (UID: \"fd26d552-d668-4eee-b51c-468b0f48d5e7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-l5l7n" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.427165 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt7mc\" (UniqueName: \"kubernetes.io/projected/6dc06d35-7a66-4c6a-bd91-1de635f2160f-kube-api-access-jt7mc\") pod \"obo-prometheus-operator-7c8cf85677-2tptp\" (UID: \"6dc06d35-7a66-4c6a-bd91-1de635f2160f\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-2tptp" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.427213 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5797d328-2922-4183-86d5-7d237952df39-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6dd5789d46-mf765\" (UID: \"5797d328-2922-4183-86d5-7d237952df39\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-mf765" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.459749 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt7mc\" (UniqueName: 
\"kubernetes.io/projected/6dc06d35-7a66-4c6a-bd91-1de635f2160f-kube-api-access-jt7mc\") pod \"obo-prometheus-operator-7c8cf85677-2tptp\" (UID: \"6dc06d35-7a66-4c6a-bd91-1de635f2160f\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-2tptp" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.494453 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-9lt4m"] Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.495249 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-9lt4m" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.498906 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-wgr8g" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.501718 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-2tptp" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.506924 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-9lt4m"] Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.513201 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.530409 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5797d328-2922-4183-86d5-7d237952df39-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6dd5789d46-mf765\" (UID: \"5797d328-2922-4183-86d5-7d237952df39\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-mf765" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.530456 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd26d552-d668-4eee-b51c-468b0f48d5e7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6dd5789d46-l5l7n\" (UID: \"fd26d552-d668-4eee-b51c-468b0f48d5e7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-l5l7n" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.530517 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5797d328-2922-4183-86d5-7d237952df39-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6dd5789d46-mf765\" (UID: \"5797d328-2922-4183-86d5-7d237952df39\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-mf765" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.530554 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd26d552-d668-4eee-b51c-468b0f48d5e7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6dd5789d46-l5l7n\" (UID: \"fd26d552-d668-4eee-b51c-468b0f48d5e7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-l5l7n" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.535982 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd26d552-d668-4eee-b51c-468b0f48d5e7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6dd5789d46-l5l7n\" (UID: 
\"fd26d552-d668-4eee-b51c-468b0f48d5e7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-l5l7n" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.536514 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5797d328-2922-4183-86d5-7d237952df39-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6dd5789d46-mf765\" (UID: \"5797d328-2922-4183-86d5-7d237952df39\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-mf765" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.540135 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5797d328-2922-4183-86d5-7d237952df39-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6dd5789d46-mf765\" (UID: \"5797d328-2922-4183-86d5-7d237952df39\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-mf765" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.540538 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd26d552-d668-4eee-b51c-468b0f48d5e7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6dd5789d46-l5l7n\" (UID: \"fd26d552-d668-4eee-b51c-468b0f48d5e7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-l5l7n" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.593352 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-t5ld5"] Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.593996 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-t5ld5" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.596796 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-m8hm7" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.604650 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-t5ld5"] Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.614711 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-mf765" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.631552 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tttgg\" (UniqueName: \"kubernetes.io/projected/d6d5593d-78e7-4efe-a8b1-f56bd214301b-kube-api-access-tttgg\") pod \"observability-operator-cc5f78dfc-9lt4m\" (UID: \"d6d5593d-78e7-4efe-a8b1-f56bd214301b\") " pod="openshift-operators/observability-operator-cc5f78dfc-9lt4m" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.631621 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6d5593d-78e7-4efe-a8b1-f56bd214301b-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-9lt4m\" (UID: \"d6d5593d-78e7-4efe-a8b1-f56bd214301b\") " pod="openshift-operators/observability-operator-cc5f78dfc-9lt4m" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.633277 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-l5l7n" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.732403 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tttgg\" (UniqueName: \"kubernetes.io/projected/d6d5593d-78e7-4efe-a8b1-f56bd214301b-kube-api-access-tttgg\") pod \"observability-operator-cc5f78dfc-9lt4m\" (UID: \"d6d5593d-78e7-4efe-a8b1-f56bd214301b\") " pod="openshift-operators/observability-operator-cc5f78dfc-9lt4m" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.732688 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/6280e32a-cded-420b-8a4b-f65505578cf0-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-t5ld5\" (UID: \"6280e32a-cded-420b-8a4b-f65505578cf0\") " pod="openshift-operators/perses-operator-54bc95c9fb-t5ld5" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.732706 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z6s8\" (UniqueName: \"kubernetes.io/projected/6280e32a-cded-420b-8a4b-f65505578cf0-kube-api-access-9z6s8\") pod \"perses-operator-54bc95c9fb-t5ld5\" (UID: \"6280e32a-cded-420b-8a4b-f65505578cf0\") " pod="openshift-operators/perses-operator-54bc95c9fb-t5ld5" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.732748 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6d5593d-78e7-4efe-a8b1-f56bd214301b-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-9lt4m\" (UID: \"d6d5593d-78e7-4efe-a8b1-f56bd214301b\") " pod="openshift-operators/observability-operator-cc5f78dfc-9lt4m" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.735898 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6d5593d-78e7-4efe-a8b1-f56bd214301b-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-9lt4m\" (UID: \"d6d5593d-78e7-4efe-a8b1-f56bd214301b\") " pod="openshift-operators/observability-operator-cc5f78dfc-9lt4m" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.747177 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tttgg\" (UniqueName: \"kubernetes.io/projected/d6d5593d-78e7-4efe-a8b1-f56bd214301b-kube-api-access-tttgg\") pod \"observability-operator-cc5f78dfc-9lt4m\" (UID: \"d6d5593d-78e7-4efe-a8b1-f56bd214301b\") " pod="openshift-operators/observability-operator-cc5f78dfc-9lt4m" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.813840 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-9lt4m" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.833939 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/6280e32a-cded-420b-8a4b-f65505578cf0-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-t5ld5\" (UID: \"6280e32a-cded-420b-8a4b-f65505578cf0\") " pod="openshift-operators/perses-operator-54bc95c9fb-t5ld5" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.833981 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z6s8\" (UniqueName: \"kubernetes.io/projected/6280e32a-cded-420b-8a4b-f65505578cf0-kube-api-access-9z6s8\") pod \"perses-operator-54bc95c9fb-t5ld5\" (UID: \"6280e32a-cded-420b-8a4b-f65505578cf0\") " pod="openshift-operators/perses-operator-54bc95c9fb-t5ld5" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.834847 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/6280e32a-cded-420b-8a4b-f65505578cf0-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-t5ld5\" (UID: \"6280e32a-cded-420b-8a4b-f65505578cf0\") " pod="openshift-operators/perses-operator-54bc95c9fb-t5ld5" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.849627 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z6s8\" (UniqueName: \"kubernetes.io/projected/6280e32a-cded-420b-8a4b-f65505578cf0-kube-api-access-9z6s8\") pod \"perses-operator-54bc95c9fb-t5ld5\" (UID: \"6280e32a-cded-420b-8a4b-f65505578cf0\") " pod="openshift-operators/perses-operator-54bc95c9fb-t5ld5" Oct 01 13:04:07 crc kubenswrapper[4851]: I1001 13:04:07.912031 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-t5ld5" Oct 01 13:04:08 crc kubenswrapper[4851]: I1001 13:04:08.016899 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-2tptp"] Oct 01 13:04:08 crc kubenswrapper[4851]: I1001 13:04:08.082023 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-9lt4m"] Oct 01 13:04:08 crc kubenswrapper[4851]: I1001 13:04:08.102613 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-mf765"] Oct 01 13:04:08 crc kubenswrapper[4851]: I1001 13:04:08.106617 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-l5l7n"] Oct 01 13:04:08 crc kubenswrapper[4851]: I1001 13:04:08.199867 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-t5ld5"] Oct 01 13:04:08 crc kubenswrapper[4851]: I1001 13:04:08.318430 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-9lt4m" event={"ID":"d6d5593d-78e7-4efe-a8b1-f56bd214301b","Type":"ContainerStarted","Data":"6e6c58a448aab555517e7af66797f9655927e62c7d3358ce6ffa6c3ac52cab28"} Oct 01 13:04:08 crc kubenswrapper[4851]: I1001 13:04:08.319610 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-t5ld5" event={"ID":"6280e32a-cded-420b-8a4b-f65505578cf0","Type":"ContainerStarted","Data":"2aad4fa068e19b7cdd9fb9159e671d34017239c3970af5582d3c3c9cf37b841f"} Oct 01 13:04:08 crc kubenswrapper[4851]: I1001 13:04:08.320634 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-l5l7n" event={"ID":"fd26d552-d668-4eee-b51c-468b0f48d5e7","Type":"ContainerStarted","Data":"a135ff53661d15c92af59eb375e0b1020a84dd1ed9ad1cd98f25027ff21aa919"} Oct 01 13:04:08 crc kubenswrapper[4851]: I1001 13:04:08.321614 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-2tptp" event={"ID":"6dc06d35-7a66-4c6a-bd91-1de635f2160f","Type":"ContainerStarted","Data":"93caebeab634607a1d049431c156eeef12131580e136a5cebd95fd2dba30ee26"} Oct 01 13:04:08 crc kubenswrapper[4851]: I1001 13:04:08.323186 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-mf765" event={"ID":"5797d328-2922-4183-86d5-7d237952df39","Type":"ContainerStarted","Data":"726a60b2e7e23611b78d10ac16cd30ae38a88ecbd7fc12460a9a584d49fbb096"} Oct 01 13:04:22 crc kubenswrapper[4851]: E1001 13:04:22.833283 4851 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/perses-0-1-rhel9-operator@sha256:bfed9f442aea6e8165644f1dc615beea06ec7fd84ea3f8ca393a63d3627c6a7c" Oct 01 13:04:22 crc kubenswrapper[4851]: E1001 13:04:22.834129 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:perses-operator,Image:registry.redhat.io/cluster-observability-operator/perses-0-1-rhel9-operator@sha256:bfed9f442aea6e8165644f1dc615beea06ec7fd84ea3f8ca393a63d3627c6a7c,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.2.2,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openshift-service-ca,ReadOnly:true,MountPath:/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9z6s8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod perses-operator-54bc95c9fb-t5ld5_openshift-operators(6280e32a-cded-420b-8a4b-f65505578cf0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 13:04:22 crc kubenswrapper[4851]: E1001 13:04:22.835366 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/perses-operator-54bc95c9fb-t5ld5" podUID="6280e32a-cded-420b-8a4b-f65505578cf0" Oct 01 13:04:23 crc kubenswrapper[4851]: I1001 13:04:23.453987 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-9lt4m" event={"ID":"d6d5593d-78e7-4efe-a8b1-f56bd214301b","Type":"ContainerStarted","Data":"9cd0066f13b00998d17ce969cd4b22deee2740611b005b9003111e6590e5843c"} Oct 01 13:04:23 crc kubenswrapper[4851]: I1001 13:04:23.454245 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-9lt4m" Oct 01 13:04:23 crc kubenswrapper[4851]: I1001 13:04:23.455528 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-l5l7n" 
event={"ID":"fd26d552-d668-4eee-b51c-468b0f48d5e7","Type":"ContainerStarted","Data":"6de4261e751feaeaa4f5a94cf92f5541442e031e8469862901078cf57fdd24a5"} Oct 01 13:04:23 crc kubenswrapper[4851]: I1001 13:04:23.457200 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-2tptp" event={"ID":"6dc06d35-7a66-4c6a-bd91-1de635f2160f","Type":"ContainerStarted","Data":"a6fc6d577a993b6f3fdc010443e9e12f9c321d43356116bc65f1d28f063de694"} Oct 01 13:04:23 crc kubenswrapper[4851]: I1001 13:04:23.458837 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-mf765" event={"ID":"5797d328-2922-4183-86d5-7d237952df39","Type":"ContainerStarted","Data":"b6ece30e45ee43c5d2323a79fbeab52af23085961160acded5f6e4e4c76bcf50"} Oct 01 13:04:23 crc kubenswrapper[4851]: E1001 13:04:23.460056 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/perses-0-1-rhel9-operator@sha256:bfed9f442aea6e8165644f1dc615beea06ec7fd84ea3f8ca393a63d3627c6a7c\\\"\"" pod="openshift-operators/perses-operator-54bc95c9fb-t5ld5" podUID="6280e32a-cded-420b-8a4b-f65505578cf0" Oct 01 13:04:23 crc kubenswrapper[4851]: I1001 13:04:23.491241 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-9lt4m" Oct 01 13:04:23 crc kubenswrapper[4851]: I1001 13:04:23.500672 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-9lt4m" podStartSLOduration=1.7048887719999999 podStartE2EDuration="16.500654145s" podCreationTimestamp="2025-10-01 13:04:07 +0000 UTC" firstStartedPulling="2025-10-01 13:04:08.096360801 +0000 UTC m=+656.441478287" lastFinishedPulling="2025-10-01 13:04:22.892126174 +0000 UTC m=+671.237243660" observedRunningTime="2025-10-01 13:04:23.497120763 +0000 UTC m=+671.842238259" watchObservedRunningTime="2025-10-01 13:04:23.500654145 +0000 UTC m=+671.845771631" Oct 01 13:04:23 crc kubenswrapper[4851]: I1001 13:04:23.546410 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-mf765" podStartSLOduration=1.767024213 podStartE2EDuration="16.54639359s" podCreationTimestamp="2025-10-01 13:04:07 +0000 UTC" firstStartedPulling="2025-10-01 13:04:08.112770087 +0000 UTC m=+656.457887563" lastFinishedPulling="2025-10-01 13:04:22.892139454 +0000 UTC m=+671.237256940" observedRunningTime="2025-10-01 13:04:23.545464043 +0000 UTC m=+671.890581529" watchObservedRunningTime="2025-10-01 13:04:23.54639359 +0000 UTC m=+671.891511076" Oct 01 13:04:23 crc kubenswrapper[4851]: I1001 13:04:23.606793 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dd5789d46-l5l7n" podStartSLOduration=1.808925798 podStartE2EDuration="16.6067741s" podCreationTimestamp="2025-10-01 13:04:07 +0000 UTC" firstStartedPulling="2025-10-01 13:04:08.135714202 +0000 UTC m=+656.480831688" lastFinishedPulling="2025-10-01 13:04:22.933562494 +0000 UTC m=+671.278679990" observedRunningTime="2025-10-01 13:04:23.605949196 +0000 UTC m=+671.951066682" watchObservedRunningTime="2025-10-01 13:04:23.6067741 +0000 UTC m=+671.951891586" Oct 01 13:04:35 crc kubenswrapper[4851]: I1001 
13:04:35.366752 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-2tptp" podStartSLOduration=13.521068230000001 podStartE2EDuration="28.366723858s" podCreationTimestamp="2025-10-01 13:04:07 +0000 UTC" firstStartedPulling="2025-10-01 13:04:08.045200749 +0000 UTC m=+656.390318235" lastFinishedPulling="2025-10-01 13:04:22.890856377 +0000 UTC m=+671.235973863" observedRunningTime="2025-10-01 13:04:23.632864566 +0000 UTC m=+671.977982062" watchObservedRunningTime="2025-10-01 13:04:35.366723858 +0000 UTC m=+683.711841384" Oct 01 13:04:37 crc kubenswrapper[4851]: I1001 13:04:37.565782 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-t5ld5" event={"ID":"6280e32a-cded-420b-8a4b-f65505578cf0","Type":"ContainerStarted","Data":"f5fe23b6508fd681a342bfc64cdc35e9e1b53d5eb2c42311418c1e5b9acbc247"} Oct 01 13:04:37 crc kubenswrapper[4851]: I1001 13:04:37.566385 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-t5ld5" Oct 01 13:04:37 crc kubenswrapper[4851]: I1001 13:04:37.588822 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-t5ld5" podStartSLOduration=2.39432403 podStartE2EDuration="30.588790814s" podCreationTimestamp="2025-10-01 13:04:07 +0000 UTC" firstStartedPulling="2025-10-01 13:04:08.222845116 +0000 UTC m=+656.567962602" lastFinishedPulling="2025-10-01 13:04:36.41731186 +0000 UTC m=+684.762429386" observedRunningTime="2025-10-01 13:04:37.58831236 +0000 UTC m=+685.933429876" watchObservedRunningTime="2025-10-01 13:04:37.588790814 +0000 UTC m=+685.933908350" Oct 01 13:04:47 crc kubenswrapper[4851]: I1001 13:04:47.916111 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-t5ld5" Oct 01 13:05:00 crc kubenswrapper[4851]: I1001 13:05:00.049961 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:05:00 crc kubenswrapper[4851]: I1001 13:05:00.050878 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:05:06 crc kubenswrapper[4851]: I1001 13:05:06.542209 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8"] Oct 01 13:05:06 crc kubenswrapper[4851]: I1001 13:05:06.544832 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8" Oct 01 13:05:06 crc kubenswrapper[4851]: I1001 13:05:06.547861 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 01 13:05:06 crc kubenswrapper[4851]: I1001 13:05:06.559079 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8"] Oct 01 13:05:06 crc kubenswrapper[4851]: I1001 13:05:06.705904 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz99q\" (UniqueName: \"kubernetes.io/projected/127a1393-40c1-4ee6-84a6-9ccd6dee9595-kube-api-access-xz99q\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8\" (UID: \"127a1393-40c1-4ee6-84a6-9ccd6dee9595\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8" Oct 01 13:05:06 crc kubenswrapper[4851]: I1001 13:05:06.705952 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/127a1393-40c1-4ee6-84a6-9ccd6dee9595-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8\" (UID: \"127a1393-40c1-4ee6-84a6-9ccd6dee9595\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8" Oct 01 13:05:06 crc kubenswrapper[4851]: I1001 13:05:06.706073 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/127a1393-40c1-4ee6-84a6-9ccd6dee9595-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8\" (UID: \"127a1393-40c1-4ee6-84a6-9ccd6dee9595\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8" Oct 01 13:05:06 crc kubenswrapper[4851]: I1001 13:05:06.807154 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz99q\" (UniqueName: \"kubernetes.io/projected/127a1393-40c1-4ee6-84a6-9ccd6dee9595-kube-api-access-xz99q\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8\" (UID: \"127a1393-40c1-4ee6-84a6-9ccd6dee9595\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8" Oct 01 13:05:06 crc kubenswrapper[4851]: I1001 13:05:06.807371 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/127a1393-40c1-4ee6-84a6-9ccd6dee9595-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8\" (UID: \"127a1393-40c1-4ee6-84a6-9ccd6dee9595\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8" Oct 01 13:05:06 crc kubenswrapper[4851]: I1001 13:05:06.807413 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/127a1393-40c1-4ee6-84a6-9ccd6dee9595-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8\" (UID: \"127a1393-40c1-4ee6-84a6-9ccd6dee9595\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8" Oct 01 13:05:06 crc kubenswrapper[4851]: I1001 13:05:06.808210 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/127a1393-40c1-4ee6-84a6-9ccd6dee9595-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8\" (UID: \"127a1393-40c1-4ee6-84a6-9ccd6dee9595\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8" Oct 01 13:05:06 crc kubenswrapper[4851]: I1001 13:05:06.808433 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/127a1393-40c1-4ee6-84a6-9ccd6dee9595-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8\" (UID: \"127a1393-40c1-4ee6-84a6-9ccd6dee9595\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8" Oct 01 13:05:06 crc kubenswrapper[4851]: I1001 13:05:06.827530 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz99q\" (UniqueName: \"kubernetes.io/projected/127a1393-40c1-4ee6-84a6-9ccd6dee9595-kube-api-access-xz99q\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8\" (UID: \"127a1393-40c1-4ee6-84a6-9ccd6dee9595\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8" Oct 01 13:05:06 crc kubenswrapper[4851]: I1001 13:05:06.862737 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8" Oct 01 13:05:07 crc kubenswrapper[4851]: I1001 13:05:07.085549 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8"] Oct 01 13:05:07 crc kubenswrapper[4851]: I1001 13:05:07.790928 4851 generic.go:334] "Generic (PLEG): container finished" podID="127a1393-40c1-4ee6-84a6-9ccd6dee9595" containerID="3254bc51a3ee83120dfd96c7c7b4521d2f85717d177c2bf61e59fe077b6a1507" exitCode=0 Oct 01 13:05:07 crc kubenswrapper[4851]: I1001 13:05:07.791008 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8" event={"ID":"127a1393-40c1-4ee6-84a6-9ccd6dee9595","Type":"ContainerDied","Data":"3254bc51a3ee83120dfd96c7c7b4521d2f85717d177c2bf61e59fe077b6a1507"} Oct 01 13:05:07 crc kubenswrapper[4851]: I1001 13:05:07.791267 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8" event={"ID":"127a1393-40c1-4ee6-84a6-9ccd6dee9595","Type":"ContainerStarted","Data":"591938745e325e5fe22e46ecd9994c1039f16a7506e33d19ce70c4599838eac1"} Oct 01 13:05:10 crc kubenswrapper[4851]: I1001 13:05:10.829833 4851 generic.go:334] "Generic (PLEG): container finished" podID="127a1393-40c1-4ee6-84a6-9ccd6dee9595" containerID="139d7d5e6a8cbc21e48bcf9ec7e6bdd3ad2086d7ecd2951f6836a15afd0eabda" exitCode=0 Oct 01 13:05:10 crc kubenswrapper[4851]: I1001 13:05:10.829917 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8" event={"ID":"127a1393-40c1-4ee6-84a6-9ccd6dee9595","Type":"ContainerDied","Data":"139d7d5e6a8cbc21e48bcf9ec7e6bdd3ad2086d7ecd2951f6836a15afd0eabda"} Oct 01 13:05:11 crc kubenswrapper[4851]: I1001 13:05:11.843488 4851 generic.go:334] "Generic (PLEG): container finished" podID="127a1393-40c1-4ee6-84a6-9ccd6dee9595" containerID="33affaae9f9a87afbb27eabb6f1d21961257a422732ddbf69b7b502781456561" exitCode=0 Oct 01 13:05:11 crc kubenswrapper[4851]: I1001 
13:05:11.843611 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8" event={"ID":"127a1393-40c1-4ee6-84a6-9ccd6dee9595","Type":"ContainerDied","Data":"33affaae9f9a87afbb27eabb6f1d21961257a422732ddbf69b7b502781456561"} Oct 01 13:05:13 crc kubenswrapper[4851]: I1001 13:05:13.110486 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8" Oct 01 13:05:13 crc kubenswrapper[4851]: I1001 13:05:13.206881 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz99q\" (UniqueName: \"kubernetes.io/projected/127a1393-40c1-4ee6-84a6-9ccd6dee9595-kube-api-access-xz99q\") pod \"127a1393-40c1-4ee6-84a6-9ccd6dee9595\" (UID: \"127a1393-40c1-4ee6-84a6-9ccd6dee9595\") " Oct 01 13:05:13 crc kubenswrapper[4851]: I1001 13:05:13.206937 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/127a1393-40c1-4ee6-84a6-9ccd6dee9595-util\") pod \"127a1393-40c1-4ee6-84a6-9ccd6dee9595\" (UID: \"127a1393-40c1-4ee6-84a6-9ccd6dee9595\") " Oct 01 13:05:13 crc kubenswrapper[4851]: I1001 13:05:13.206970 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/127a1393-40c1-4ee6-84a6-9ccd6dee9595-bundle\") pod \"127a1393-40c1-4ee6-84a6-9ccd6dee9595\" (UID: \"127a1393-40c1-4ee6-84a6-9ccd6dee9595\") " Oct 01 13:05:13 crc kubenswrapper[4851]: I1001 13:05:13.207826 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/127a1393-40c1-4ee6-84a6-9ccd6dee9595-bundle" (OuterVolumeSpecName: "bundle") pod "127a1393-40c1-4ee6-84a6-9ccd6dee9595" (UID: "127a1393-40c1-4ee6-84a6-9ccd6dee9595"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:05:13 crc kubenswrapper[4851]: I1001 13:05:13.217626 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/127a1393-40c1-4ee6-84a6-9ccd6dee9595-util" (OuterVolumeSpecName: "util") pod "127a1393-40c1-4ee6-84a6-9ccd6dee9595" (UID: "127a1393-40c1-4ee6-84a6-9ccd6dee9595"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:05:13 crc kubenswrapper[4851]: I1001 13:05:13.219119 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/127a1393-40c1-4ee6-84a6-9ccd6dee9595-kube-api-access-xz99q" (OuterVolumeSpecName: "kube-api-access-xz99q") pod "127a1393-40c1-4ee6-84a6-9ccd6dee9595" (UID: "127a1393-40c1-4ee6-84a6-9ccd6dee9595"). InnerVolumeSpecName "kube-api-access-xz99q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:05:13 crc kubenswrapper[4851]: I1001 13:05:13.309964 4851 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/127a1393-40c1-4ee6-84a6-9ccd6dee9595-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:05:13 crc kubenswrapper[4851]: I1001 13:05:13.310026 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz99q\" (UniqueName: \"kubernetes.io/projected/127a1393-40c1-4ee6-84a6-9ccd6dee9595-kube-api-access-xz99q\") on node \"crc\" DevicePath \"\"" Oct 01 13:05:13 crc kubenswrapper[4851]: I1001 13:05:13.310047 4851 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/127a1393-40c1-4ee6-84a6-9ccd6dee9595-util\") on node \"crc\" DevicePath \"\"" Oct 01 13:05:13 crc kubenswrapper[4851]: I1001 13:05:13.858927 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8" event={"ID":"127a1393-40c1-4ee6-84a6-9ccd6dee9595","Type":"ContainerDied","Data":"591938745e325e5fe22e46ecd9994c1039f16a7506e33d19ce70c4599838eac1"} Oct 01 13:05:13 crc kubenswrapper[4851]: I1001 13:05:13.859002 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="591938745e325e5fe22e46ecd9994c1039f16a7506e33d19ce70c4599838eac1" Oct 01 13:05:13 crc kubenswrapper[4851]: I1001 13:05:13.859032 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8" Oct 01 13:05:17 crc kubenswrapper[4851]: I1001 13:05:17.957203 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-p2vrr"] Oct 01 13:05:17 crc kubenswrapper[4851]: E1001 13:05:17.957883 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127a1393-40c1-4ee6-84a6-9ccd6dee9595" containerName="util" Oct 01 13:05:17 crc kubenswrapper[4851]: I1001 13:05:17.957903 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="127a1393-40c1-4ee6-84a6-9ccd6dee9595" containerName="util" Oct 01 13:05:17 crc kubenswrapper[4851]: E1001 13:05:17.957923 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127a1393-40c1-4ee6-84a6-9ccd6dee9595" containerName="pull" Oct 01 13:05:17 crc kubenswrapper[4851]: I1001 13:05:17.957936 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="127a1393-40c1-4ee6-84a6-9ccd6dee9595" containerName="pull" Oct 01 13:05:17 crc kubenswrapper[4851]: E1001 13:05:17.957967 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127a1393-40c1-4ee6-84a6-9ccd6dee9595" containerName="extract" Oct 01 13:05:17 crc kubenswrapper[4851]: I1001 13:05:17.957980 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="127a1393-40c1-4ee6-84a6-9ccd6dee9595" containerName="extract" Oct 01 13:05:17 crc kubenswrapper[4851]: I1001 13:05:17.958169 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="127a1393-40c1-4ee6-84a6-9ccd6dee9595" containerName="extract" Oct 01 13:05:17 crc kubenswrapper[4851]: I1001 13:05:17.958862 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-p2vrr" Oct 01 13:05:17 crc kubenswrapper[4851]: I1001 13:05:17.960230 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-ztxgm" Oct 01 13:05:17 crc kubenswrapper[4851]: I1001 13:05:17.960515 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 01 13:05:17 crc kubenswrapper[4851]: I1001 13:05:17.961550 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 01 13:05:17 crc kubenswrapper[4851]: I1001 13:05:17.966076 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-p2vrr"] Oct 01 13:05:18 crc kubenswrapper[4851]: I1001 13:05:18.125290 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp95l\" (UniqueName: \"kubernetes.io/projected/92d2d84c-4ae5-4943-9eeb-f88f900ec658-kube-api-access-sp95l\") pod \"nmstate-operator-5d6f6cfd66-p2vrr\" (UID: \"92d2d84c-4ae5-4943-9eeb-f88f900ec658\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-p2vrr" Oct 01 13:05:18 crc kubenswrapper[4851]: I1001 13:05:18.226189 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp95l\" (UniqueName: \"kubernetes.io/projected/92d2d84c-4ae5-4943-9eeb-f88f900ec658-kube-api-access-sp95l\") pod \"nmstate-operator-5d6f6cfd66-p2vrr\" (UID: \"92d2d84c-4ae5-4943-9eeb-f88f900ec658\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-p2vrr" Oct 01 13:05:18 crc kubenswrapper[4851]: I1001 13:05:18.244745 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp95l\" (UniqueName: \"kubernetes.io/projected/92d2d84c-4ae5-4943-9eeb-f88f900ec658-kube-api-access-sp95l\") pod \"nmstate-operator-5d6f6cfd66-p2vrr\" (UID: \"92d2d84c-4ae5-4943-9eeb-f88f900ec658\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-p2vrr" Oct 01 13:05:18 crc kubenswrapper[4851]: I1001 13:05:18.325223 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-p2vrr" Oct 01 13:05:18 crc kubenswrapper[4851]: I1001 13:05:18.532749 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-p2vrr"] Oct 01 13:05:18 crc kubenswrapper[4851]: W1001 13:05:18.538682 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92d2d84c_4ae5_4943_9eeb_f88f900ec658.slice/crio-cdd88c5695ee3c2c51e119235c697de0008f94126f8f335776bbd553118ed494 WatchSource:0}: Error finding container cdd88c5695ee3c2c51e119235c697de0008f94126f8f335776bbd553118ed494: Status 404 returned error can't find the container with id cdd88c5695ee3c2c51e119235c697de0008f94126f8f335776bbd553118ed494 Oct 01 13:05:18 crc kubenswrapper[4851]: I1001 13:05:18.899858 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-p2vrr" event={"ID":"92d2d84c-4ae5-4943-9eeb-f88f900ec658","Type":"ContainerStarted","Data":"cdd88c5695ee3c2c51e119235c697de0008f94126f8f335776bbd553118ed494"} Oct 01 13:05:22 crc kubenswrapper[4851]: I1001 13:05:22.934123 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-p2vrr" event={"ID":"92d2d84c-4ae5-4943-9eeb-f88f900ec658","Type":"ContainerStarted","Data":"a2015bf8f4c016d248f6a79989702bfd020889bd749612c7635a02064fc90b93"} Oct 01 13:05:22 crc kubenswrapper[4851]: I1001 13:05:22.957519 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-p2vrr" podStartSLOduration=2.412059555 podStartE2EDuration="5.957478805s" podCreationTimestamp="2025-10-01 13:05:17 +0000 UTC" firstStartedPulling="2025-10-01 13:05:18.540711337 +0000 UTC m=+726.885828823" lastFinishedPulling="2025-10-01 13:05:22.086130547 +0000 UTC m=+730.431248073" observedRunningTime="2025-10-01 13:05:22.955737684 +0000 UTC m=+731.300855200" watchObservedRunningTime="2025-10-01 13:05:22.957478805 +0000 UTC m=+731.302596301" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.782268 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-2bcp8"] Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.784427 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-2bcp8" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.787372 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-zpvld"] Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.788303 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-zpvld" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.795475 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-jtj27" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.801957 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-2bcp8"] Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.806950 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.807299 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-zpvld"] Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.812328 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-z4ptw"] Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.814198 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-z4ptw" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.860632 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fccb3980-5aa4-4221-8f25-c14c68913a81-dbus-socket\") pod \"nmstate-handler-z4ptw\" (UID: \"fccb3980-5aa4-4221-8f25-c14c68913a81\") " pod="openshift-nmstate/nmstate-handler-z4ptw" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.860691 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrclf\" (UniqueName: \"kubernetes.io/projected/f4752bf8-2355-4922-92d2-5546fd2c4340-kube-api-access-lrclf\") pod \"nmstate-webhook-6d689559c5-zpvld\" (UID: \"f4752bf8-2355-4922-92d2-5546fd2c4340\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-zpvld" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.860735 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f4752bf8-2355-4922-92d2-5546fd2c4340-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-zpvld\" (UID: \"f4752bf8-2355-4922-92d2-5546fd2c4340\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-zpvld" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.860768 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfzj2\" (UniqueName: \"kubernetes.io/projected/94133d01-7539-4ee5-9151-62f52ec7a1e8-kube-api-access-pfzj2\") pod \"nmstate-metrics-58fcddf996-2bcp8\" (UID: \"94133d01-7539-4ee5-9151-62f52ec7a1e8\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-2bcp8" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.860805 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fccb3980-5aa4-4221-8f25-c14c68913a81-ovs-socket\") pod \"nmstate-handler-z4ptw\" (UID: \"fccb3980-5aa4-4221-8f25-c14c68913a81\") " pod="openshift-nmstate/nmstate-handler-z4ptw" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.860827 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgwwf\" (UniqueName: \"kubernetes.io/projected/fccb3980-5aa4-4221-8f25-c14c68913a81-kube-api-access-qgwwf\") pod \"nmstate-handler-z4ptw\" (UID: 
\"fccb3980-5aa4-4221-8f25-c14c68913a81\") " pod="openshift-nmstate/nmstate-handler-z4ptw" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.860866 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fccb3980-5aa4-4221-8f25-c14c68913a81-nmstate-lock\") pod \"nmstate-handler-z4ptw\" (UID: \"fccb3980-5aa4-4221-8f25-c14c68913a81\") " pod="openshift-nmstate/nmstate-handler-z4ptw" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.909787 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kpmg"] Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.910404 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kpmg" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.912081 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.912995 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.914228 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-p5cl2" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.924035 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kpmg"] Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.962000 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrclf\" (UniqueName: \"kubernetes.io/projected/f4752bf8-2355-4922-92d2-5546fd2c4340-kube-api-access-lrclf\") pod \"nmstate-webhook-6d689559c5-zpvld\" (UID: \"f4752bf8-2355-4922-92d2-5546fd2c4340\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-zpvld" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.962072 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f4752bf8-2355-4922-92d2-5546fd2c4340-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-zpvld\" (UID: \"f4752bf8-2355-4922-92d2-5546fd2c4340\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-zpvld" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.962117 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfzj2\" (UniqueName: \"kubernetes.io/projected/94133d01-7539-4ee5-9151-62f52ec7a1e8-kube-api-access-pfzj2\") pod \"nmstate-metrics-58fcddf996-2bcp8\" (UID: \"94133d01-7539-4ee5-9151-62f52ec7a1e8\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-2bcp8" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.962164 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/874ffc52-8446-4804-baa4-b75d2348af0d-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-9kpmg\" (UID: \"874ffc52-8446-4804-baa4-b75d2348af0d\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kpmg" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.962216 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fccb3980-5aa4-4221-8f25-c14c68913a81-ovs-socket\") pod \"nmstate-handler-z4ptw\" (UID: 
\"fccb3980-5aa4-4221-8f25-c14c68913a81\") " pod="openshift-nmstate/nmstate-handler-z4ptw" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.962242 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgwwf\" (UniqueName: \"kubernetes.io/projected/fccb3980-5aa4-4221-8f25-c14c68913a81-kube-api-access-qgwwf\") pod \"nmstate-handler-z4ptw\" (UID: \"fccb3980-5aa4-4221-8f25-c14c68913a81\") " pod="openshift-nmstate/nmstate-handler-z4ptw" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.962286 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fccb3980-5aa4-4221-8f25-c14c68913a81-ovs-socket\") pod \"nmstate-handler-z4ptw\" (UID: \"fccb3980-5aa4-4221-8f25-c14c68913a81\") " pod="openshift-nmstate/nmstate-handler-z4ptw" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.962302 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/874ffc52-8446-4804-baa4-b75d2348af0d-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-9kpmg\" (UID: \"874ffc52-8446-4804-baa4-b75d2348af0d\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kpmg" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.962758 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fccb3980-5aa4-4221-8f25-c14c68913a81-nmstate-lock\") pod \"nmstate-handler-z4ptw\" (UID: \"fccb3980-5aa4-4221-8f25-c14c68913a81\") " pod="openshift-nmstate/nmstate-handler-z4ptw" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.962838 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fccb3980-5aa4-4221-8f25-c14c68913a81-nmstate-lock\") pod \"nmstate-handler-z4ptw\" (UID: \"fccb3980-5aa4-4221-8f25-c14c68913a81\") " pod="openshift-nmstate/nmstate-handler-z4ptw" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.962904 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fccb3980-5aa4-4221-8f25-c14c68913a81-dbus-socket\") pod \"nmstate-handler-z4ptw\" (UID: \"fccb3980-5aa4-4221-8f25-c14c68913a81\") " pod="openshift-nmstate/nmstate-handler-z4ptw" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.963201 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fccb3980-5aa4-4221-8f25-c14c68913a81-dbus-socket\") pod \"nmstate-handler-z4ptw\" (UID: \"fccb3980-5aa4-4221-8f25-c14c68913a81\") " pod="openshift-nmstate/nmstate-handler-z4ptw" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.962934 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r76m6\" (UniqueName: \"kubernetes.io/projected/874ffc52-8446-4804-baa4-b75d2348af0d-kube-api-access-r76m6\") pod \"nmstate-console-plugin-864bb6dfb5-9kpmg\" (UID: \"874ffc52-8446-4804-baa4-b75d2348af0d\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kpmg" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.968554 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f4752bf8-2355-4922-92d2-5546fd2c4340-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-zpvld\" (UID: 
\"f4752bf8-2355-4922-92d2-5546fd2c4340\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-zpvld" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.978251 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgwwf\" (UniqueName: \"kubernetes.io/projected/fccb3980-5aa4-4221-8f25-c14c68913a81-kube-api-access-qgwwf\") pod \"nmstate-handler-z4ptw\" (UID: \"fccb3980-5aa4-4221-8f25-c14c68913a81\") " pod="openshift-nmstate/nmstate-handler-z4ptw" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.982238 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfzj2\" (UniqueName: \"kubernetes.io/projected/94133d01-7539-4ee5-9151-62f52ec7a1e8-kube-api-access-pfzj2\") pod \"nmstate-metrics-58fcddf996-2bcp8\" (UID: \"94133d01-7539-4ee5-9151-62f52ec7a1e8\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-2bcp8" Oct 01 13:05:27 crc kubenswrapper[4851]: I1001 13:05:27.985026 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrclf\" (UniqueName: \"kubernetes.io/projected/f4752bf8-2355-4922-92d2-5546fd2c4340-kube-api-access-lrclf\") pod \"nmstate-webhook-6d689559c5-zpvld\" (UID: \"f4752bf8-2355-4922-92d2-5546fd2c4340\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-zpvld" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.064454 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r76m6\" (UniqueName: \"kubernetes.io/projected/874ffc52-8446-4804-baa4-b75d2348af0d-kube-api-access-r76m6\") pod \"nmstate-console-plugin-864bb6dfb5-9kpmg\" (UID: \"874ffc52-8446-4804-baa4-b75d2348af0d\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kpmg" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.064570 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/874ffc52-8446-4804-baa4-b75d2348af0d-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-9kpmg\" (UID: \"874ffc52-8446-4804-baa4-b75d2348af0d\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kpmg" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.064602 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/874ffc52-8446-4804-baa4-b75d2348af0d-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-9kpmg\" (UID: \"874ffc52-8446-4804-baa4-b75d2348af0d\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kpmg" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.065695 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/874ffc52-8446-4804-baa4-b75d2348af0d-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-9kpmg\" (UID: \"874ffc52-8446-4804-baa4-b75d2348af0d\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kpmg" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.071016 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/874ffc52-8446-4804-baa4-b75d2348af0d-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-9kpmg\" (UID: \"874ffc52-8446-4804-baa4-b75d2348af0d\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kpmg" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.083680 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-r76m6\" (UniqueName: \"kubernetes.io/projected/874ffc52-8446-4804-baa4-b75d2348af0d-kube-api-access-r76m6\") pod \"nmstate-console-plugin-864bb6dfb5-9kpmg\" (UID: \"874ffc52-8446-4804-baa4-b75d2348af0d\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kpmg" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.106510 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-68f4ff5c48-fgjpf"] Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.107151 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.117221 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68f4ff5c48-fgjpf"] Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.120102 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-2bcp8" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.133443 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-zpvld" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.151206 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-z4ptw" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.165942 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c4ada12-d469-4342-a28f-164a8aef8142-console-serving-cert\") pod \"console-68f4ff5c48-fgjpf\" (UID: \"4c4ada12-d469-4342-a28f-164a8aef8142\") " pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.166004 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jd85\" (UniqueName: \"kubernetes.io/projected/4c4ada12-d469-4342-a28f-164a8aef8142-kube-api-access-7jd85\") pod \"console-68f4ff5c48-fgjpf\" (UID: \"4c4ada12-d469-4342-a28f-164a8aef8142\") " pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.166036 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c4ada12-d469-4342-a28f-164a8aef8142-trusted-ca-bundle\") pod \"console-68f4ff5c48-fgjpf\" (UID: \"4c4ada12-d469-4342-a28f-164a8aef8142\") " pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.166060 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c4ada12-d469-4342-a28f-164a8aef8142-oauth-serving-cert\") pod \"console-68f4ff5c48-fgjpf\" (UID: \"4c4ada12-d469-4342-a28f-164a8aef8142\") " pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.166094 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c4ada12-d469-4342-a28f-164a8aef8142-console-oauth-config\") pod \"console-68f4ff5c48-fgjpf\" (UID: \"4c4ada12-d469-4342-a28f-164a8aef8142\") " pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.166111 4851 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c4ada12-d469-4342-a28f-164a8aef8142-console-config\") pod \"console-68f4ff5c48-fgjpf\" (UID: \"4c4ada12-d469-4342-a28f-164a8aef8142\") " pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.166130 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c4ada12-d469-4342-a28f-164a8aef8142-service-ca\") pod \"console-68f4ff5c48-fgjpf\" (UID: \"4c4ada12-d469-4342-a28f-164a8aef8142\") " pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.225617 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kpmg" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.297780 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c4ada12-d469-4342-a28f-164a8aef8142-console-serving-cert\") pod \"console-68f4ff5c48-fgjpf\" (UID: \"4c4ada12-d469-4342-a28f-164a8aef8142\") " pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.297858 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jd85\" (UniqueName: \"kubernetes.io/projected/4c4ada12-d469-4342-a28f-164a8aef8142-kube-api-access-7jd85\") pod \"console-68f4ff5c48-fgjpf\" (UID: \"4c4ada12-d469-4342-a28f-164a8aef8142\") " pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.297901 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c4ada12-d469-4342-a28f-164a8aef8142-trusted-ca-bundle\") pod \"console-68f4ff5c48-fgjpf\" (UID: \"4c4ada12-d469-4342-a28f-164a8aef8142\") " pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.297937 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c4ada12-d469-4342-a28f-164a8aef8142-oauth-serving-cert\") pod \"console-68f4ff5c48-fgjpf\" (UID: \"4c4ada12-d469-4342-a28f-164a8aef8142\") " pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.298463 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c4ada12-d469-4342-a28f-164a8aef8142-console-oauth-config\") pod \"console-68f4ff5c48-fgjpf\" (UID: \"4c4ada12-d469-4342-a28f-164a8aef8142\") " pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.298490 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c4ada12-d469-4342-a28f-164a8aef8142-console-config\") pod \"console-68f4ff5c48-fgjpf\" (UID: \"4c4ada12-d469-4342-a28f-164a8aef8142\") " pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.298533 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/4c4ada12-d469-4342-a28f-164a8aef8142-service-ca\") pod \"console-68f4ff5c48-fgjpf\" (UID: \"4c4ada12-d469-4342-a28f-164a8aef8142\") " pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.299555 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c4ada12-d469-4342-a28f-164a8aef8142-service-ca\") pod \"console-68f4ff5c48-fgjpf\" (UID: \"4c4ada12-d469-4342-a28f-164a8aef8142\") " pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.299757 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c4ada12-d469-4342-a28f-164a8aef8142-oauth-serving-cert\") pod \"console-68f4ff5c48-fgjpf\" (UID: \"4c4ada12-d469-4342-a28f-164a8aef8142\") " pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.300784 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c4ada12-d469-4342-a28f-164a8aef8142-trusted-ca-bundle\") pod \"console-68f4ff5c48-fgjpf\" (UID: \"4c4ada12-d469-4342-a28f-164a8aef8142\") " pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.301317 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c4ada12-d469-4342-a28f-164a8aef8142-console-config\") pod \"console-68f4ff5c48-fgjpf\" (UID: \"4c4ada12-d469-4342-a28f-164a8aef8142\") " pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.303937 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c4ada12-d469-4342-a28f-164a8aef8142-console-oauth-config\") pod \"console-68f4ff5c48-fgjpf\" (UID: \"4c4ada12-d469-4342-a28f-164a8aef8142\") " pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.315618 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c4ada12-d469-4342-a28f-164a8aef8142-console-serving-cert\") pod \"console-68f4ff5c48-fgjpf\" (UID: \"4c4ada12-d469-4342-a28f-164a8aef8142\") " pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.316657 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jd85\" (UniqueName: \"kubernetes.io/projected/4c4ada12-d469-4342-a28f-164a8aef8142-kube-api-access-7jd85\") pod \"console-68f4ff5c48-fgjpf\" (UID: \"4c4ada12-d469-4342-a28f-164a8aef8142\") " pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.423468 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.608206 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-2bcp8"] Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.613258 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68f4ff5c48-fgjpf"] Oct 01 13:05:28 crc kubenswrapper[4851]: W1001 13:05:28.620926 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c4ada12_d469_4342_a28f_164a8aef8142.slice/crio-eb561e24169b1c94ae78d39752b71c373ab5abd2e6ebf860be93e786195d4709 WatchSource:0}: Error finding container eb561e24169b1c94ae78d39752b71c373ab5abd2e6ebf860be93e786195d4709: Status 404 returned error can't find the container with id eb561e24169b1c94ae78d39752b71c373ab5abd2e6ebf860be93e786195d4709 Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.638301 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-zpvld"] Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.641035 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kpmg"] Oct 01 13:05:28 crc kubenswrapper[4851]: W1001 13:05:28.645746 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod874ffc52_8446_4804_baa4_b75d2348af0d.slice/crio-418d712a0df69c9dc87d2d41d06c976e0b5f69a6dcdf303d678a6f98539bb380 WatchSource:0}: Error finding container 418d712a0df69c9dc87d2d41d06c976e0b5f69a6dcdf303d678a6f98539bb380: Status 404 returned error can't find the container with id 418d712a0df69c9dc87d2d41d06c976e0b5f69a6dcdf303d678a6f98539bb380 Oct 01 13:05:28 crc kubenswrapper[4851]: W1001 13:05:28.647674 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4752bf8_2355_4922_92d2_5546fd2c4340.slice/crio-0d98f4d830a539b36883ff85e3deb902e4303d84416b78fbc3ead00df7b4e1cb WatchSource:0}: Error finding container 0d98f4d830a539b36883ff85e3deb902e4303d84416b78fbc3ead00df7b4e1cb: Status 404 returned error can't find the container with id 0d98f4d830a539b36883ff85e3deb902e4303d84416b78fbc3ead00df7b4e1cb Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.977753 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68f4ff5c48-fgjpf" event={"ID":"4c4ada12-d469-4342-a28f-164a8aef8142","Type":"ContainerStarted","Data":"b4a0323ae7c51a6b6c79650790e9d7b4f1679fc8eb3e57414a92aa2e834859f1"} Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.977834 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68f4ff5c48-fgjpf" event={"ID":"4c4ada12-d469-4342-a28f-164a8aef8142","Type":"ContainerStarted","Data":"eb561e24169b1c94ae78d39752b71c373ab5abd2e6ebf860be93e786195d4709"} Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.979159 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-z4ptw" event={"ID":"fccb3980-5aa4-4221-8f25-c14c68913a81","Type":"ContainerStarted","Data":"8b38df2d6110a60c15b1133690903fbd8389bfc225f8b405959bf70037bde72f"} Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.980439 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-2bcp8" 
event={"ID":"94133d01-7539-4ee5-9151-62f52ec7a1e8","Type":"ContainerStarted","Data":"ea6ee190fe80cdfb24c211534220d1f5639b6c8cbbc1918f37568656399537a4"} Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.982645 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kpmg" event={"ID":"874ffc52-8446-4804-baa4-b75d2348af0d","Type":"ContainerStarted","Data":"418d712a0df69c9dc87d2d41d06c976e0b5f69a6dcdf303d678a6f98539bb380"} Oct 01 13:05:28 crc kubenswrapper[4851]: I1001 13:05:28.984322 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-zpvld" event={"ID":"f4752bf8-2355-4922-92d2-5546fd2c4340","Type":"ContainerStarted","Data":"0d98f4d830a539b36883ff85e3deb902e4303d84416b78fbc3ead00df7b4e1cb"} Oct 01 13:05:29 crc kubenswrapper[4851]: I1001 13:05:29.006402 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68f4ff5c48-fgjpf" podStartSLOduration=1.006376363 podStartE2EDuration="1.006376363s" podCreationTimestamp="2025-10-01 13:05:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:05:29.001390638 +0000 UTC m=+737.346508154" watchObservedRunningTime="2025-10-01 13:05:29.006376363 +0000 UTC m=+737.351493889" Oct 01 13:05:30 crc kubenswrapper[4851]: I1001 13:05:30.050114 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:05:30 crc kubenswrapper[4851]: I1001 13:05:30.050202 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:05:34 crc kubenswrapper[4851]: I1001 13:05:34.016753 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-z4ptw" event={"ID":"fccb3980-5aa4-4221-8f25-c14c68913a81","Type":"ContainerStarted","Data":"4e0255c3d287688ca6b5c2eb7d056cbcc59398b924ac060994a1dae218bf3feb"} Oct 01 13:05:34 crc kubenswrapper[4851]: I1001 13:05:34.017257 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-z4ptw" Oct 01 13:05:34 crc kubenswrapper[4851]: I1001 13:05:34.019363 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-2bcp8" event={"ID":"94133d01-7539-4ee5-9151-62f52ec7a1e8","Type":"ContainerStarted","Data":"65ff0bcc89e2c4faa3a053bfd8ab50b22ddcabcc190d0a5b19bf9f42f08d8871"} Oct 01 13:05:34 crc kubenswrapper[4851]: I1001 13:05:34.021133 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kpmg" event={"ID":"874ffc52-8446-4804-baa4-b75d2348af0d","Type":"ContainerStarted","Data":"060960639eff874e549e0ffcb4b30e8fa25edae9a59ae8427566f5cd1bd30b77"} Oct 01 13:05:34 crc kubenswrapper[4851]: I1001 13:05:34.022769 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-zpvld" 
event={"ID":"f4752bf8-2355-4922-92d2-5546fd2c4340","Type":"ContainerStarted","Data":"5a34047302083c55bc2150c4afb21436b184c68c2ed6ba203683057bd43ca8a1"} Oct 01 13:05:34 crc kubenswrapper[4851]: I1001 13:05:34.022939 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-zpvld" Oct 01 13:05:34 crc kubenswrapper[4851]: I1001 13:05:34.042588 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-z4ptw" podStartSLOduration=2.128483881 podStartE2EDuration="7.042565698s" podCreationTimestamp="2025-10-01 13:05:27 +0000 UTC" firstStartedPulling="2025-10-01 13:05:28.184940691 +0000 UTC m=+736.530058177" lastFinishedPulling="2025-10-01 13:05:33.099022468 +0000 UTC m=+741.444139994" observedRunningTime="2025-10-01 13:05:34.039250132 +0000 UTC m=+742.384367668" watchObservedRunningTime="2025-10-01 13:05:34.042565698 +0000 UTC m=+742.387683194" Oct 01 13:05:34 crc kubenswrapper[4851]: I1001 13:05:34.062975 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-zpvld" podStartSLOduration=2.587171962 podStartE2EDuration="7.062923388s" podCreationTimestamp="2025-10-01 13:05:27 +0000 UTC" firstStartedPulling="2025-10-01 13:05:28.650284525 +0000 UTC m=+736.995402011" lastFinishedPulling="2025-10-01 13:05:33.126035911 +0000 UTC m=+741.471153437" observedRunningTime="2025-10-01 13:05:34.062238548 +0000 UTC m=+742.407356084" watchObservedRunningTime="2025-10-01 13:05:34.062923388 +0000 UTC m=+742.408040894" Oct 01 13:05:34 crc kubenswrapper[4851]: I1001 13:05:34.085452 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kpmg" podStartSLOduration=2.82488571 podStartE2EDuration="7.08542569s" podCreationTimestamp="2025-10-01 13:05:27 +0000 UTC" firstStartedPulling="2025-10-01 13:05:28.64804346 +0000 UTC m=+736.993160946" lastFinishedPulling="2025-10-01 13:05:32.90858344 +0000 UTC m=+741.253700926" observedRunningTime="2025-10-01 13:05:34.08093713 +0000 UTC m=+742.426054626" watchObservedRunningTime="2025-10-01 13:05:34.08542569 +0000 UTC m=+742.430543186" Oct 01 13:05:36 crc kubenswrapper[4851]: I1001 13:05:36.891648 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k7rvb"] Oct 01 13:05:36 crc kubenswrapper[4851]: I1001 13:05:36.892321 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" podUID="f6538b69-30fc-4cc0-80da-62537b61f41f" containerName="controller-manager" containerID="cri-o://31fcf8212060db1c25000d2426518f27760b791f3b2b30ab82bb76ec52d29229" gracePeriod=30 Oct 01 13:05:36 crc kubenswrapper[4851]: I1001 13:05:36.938542 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd"] Oct 01 13:05:36 crc kubenswrapper[4851]: I1001 13:05:36.938814 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" podUID="c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b" containerName="route-controller-manager" containerID="cri-o://02ea592b1b4b4189c8ed849e2d29c1a8f0e0d9c2341a6bcc56e43c45e0937d41" gracePeriod=30 Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.044765 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-58fcddf996-2bcp8" event={"ID":"94133d01-7539-4ee5-9151-62f52ec7a1e8","Type":"ContainerStarted","Data":"8ac866926f83b503512d8a2ad9353879761475cadac435f7c7b87c0df160f3a6"} Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.046180 4851 generic.go:334] "Generic (PLEG): container finished" podID="f6538b69-30fc-4cc0-80da-62537b61f41f" containerID="31fcf8212060db1c25000d2426518f27760b791f3b2b30ab82bb76ec52d29229" exitCode=0 Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.046206 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" event={"ID":"f6538b69-30fc-4cc0-80da-62537b61f41f","Type":"ContainerDied","Data":"31fcf8212060db1c25000d2426518f27760b791f3b2b30ab82bb76ec52d29229"} Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.060965 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-2bcp8" podStartSLOduration=2.356175458 podStartE2EDuration="10.060949036s" podCreationTimestamp="2025-10-01 13:05:27 +0000 UTC" firstStartedPulling="2025-10-01 13:05:28.617167685 +0000 UTC m=+736.962285171" lastFinishedPulling="2025-10-01 13:05:36.321941243 +0000 UTC m=+744.667058749" observedRunningTime="2025-10-01 13:05:37.059627968 +0000 UTC m=+745.404745454" watchObservedRunningTime="2025-10-01 13:05:37.060949036 +0000 UTC m=+745.406066522" Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.456019 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.515693 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.622231 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6538b69-30fc-4cc0-80da-62537b61f41f-client-ca\") pod \"f6538b69-30fc-4cc0-80da-62537b61f41f\" (UID: \"f6538b69-30fc-4cc0-80da-62537b61f41f\") " Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.622307 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-serving-cert\") pod \"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b\" (UID: \"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b\") " Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.622348 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6538b69-30fc-4cc0-80da-62537b61f41f-config\") pod \"f6538b69-30fc-4cc0-80da-62537b61f41f\" (UID: \"f6538b69-30fc-4cc0-80da-62537b61f41f\") " Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.622387 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhgmn\" (UniqueName: \"kubernetes.io/projected/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-kube-api-access-zhgmn\") pod \"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b\" (UID: \"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b\") " Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.622410 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6538b69-30fc-4cc0-80da-62537b61f41f-serving-cert\") pod \"f6538b69-30fc-4cc0-80da-62537b61f41f\" (UID: \"f6538b69-30fc-4cc0-80da-62537b61f41f\") " Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.622443 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6538b69-30fc-4cc0-80da-62537b61f41f-proxy-ca-bundles\") pod \"f6538b69-30fc-4cc0-80da-62537b61f41f\" (UID: \"f6538b69-30fc-4cc0-80da-62537b61f41f\") " Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.622477 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-client-ca\") pod \"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b\" (UID: \"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b\") " Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.622547 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7fss\" (UniqueName: \"kubernetes.io/projected/f6538b69-30fc-4cc0-80da-62537b61f41f-kube-api-access-g7fss\") pod \"f6538b69-30fc-4cc0-80da-62537b61f41f\" (UID: \"f6538b69-30fc-4cc0-80da-62537b61f41f\") " Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.622584 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-config\") pod \"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b\" (UID: \"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b\") " Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.623199 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-client-ca" (OuterVolumeSpecName: "client-ca") pod "c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b" (UID: 
"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.623476 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-config" (OuterVolumeSpecName: "config") pod "c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b" (UID: "c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.623489 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6538b69-30fc-4cc0-80da-62537b61f41f-config" (OuterVolumeSpecName: "config") pod "f6538b69-30fc-4cc0-80da-62537b61f41f" (UID: "f6538b69-30fc-4cc0-80da-62537b61f41f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.623515 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6538b69-30fc-4cc0-80da-62537b61f41f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f6538b69-30fc-4cc0-80da-62537b61f41f" (UID: "f6538b69-30fc-4cc0-80da-62537b61f41f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.623562 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6538b69-30fc-4cc0-80da-62537b61f41f-client-ca" (OuterVolumeSpecName: "client-ca") pod "f6538b69-30fc-4cc0-80da-62537b61f41f" (UID: "f6538b69-30fc-4cc0-80da-62537b61f41f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.628517 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6538b69-30fc-4cc0-80da-62537b61f41f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f6538b69-30fc-4cc0-80da-62537b61f41f" (UID: "f6538b69-30fc-4cc0-80da-62537b61f41f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.628630 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-kube-api-access-zhgmn" (OuterVolumeSpecName: "kube-api-access-zhgmn") pod "c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b" (UID: "c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b"). InnerVolumeSpecName "kube-api-access-zhgmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.629076 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6538b69-30fc-4cc0-80da-62537b61f41f-kube-api-access-g7fss" (OuterVolumeSpecName: "kube-api-access-g7fss") pod "f6538b69-30fc-4cc0-80da-62537b61f41f" (UID: "f6538b69-30fc-4cc0-80da-62537b61f41f"). InnerVolumeSpecName "kube-api-access-g7fss". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.635115 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b" (UID: "c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.724439 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhgmn\" (UniqueName: \"kubernetes.io/projected/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-kube-api-access-zhgmn\") on node \"crc\" DevicePath \"\"" Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.724492 4851 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6538b69-30fc-4cc0-80da-62537b61f41f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.724554 4851 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6538b69-30fc-4cc0-80da-62537b61f41f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.724578 4851 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.724600 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7fss\" (UniqueName: \"kubernetes.io/projected/f6538b69-30fc-4cc0-80da-62537b61f41f-kube-api-access-g7fss\") on node \"crc\" DevicePath \"\"" Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.724622 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.724645 4851 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6538b69-30fc-4cc0-80da-62537b61f41f-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.724666 4851 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:05:37 crc kubenswrapper[4851]: I1001 13:05:37.724688 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6538b69-30fc-4cc0-80da-62537b61f41f-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.059446 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" event={"ID":"f6538b69-30fc-4cc0-80da-62537b61f41f","Type":"ContainerDied","Data":"86dc48ad22416747fe5700b8f1e98828d980b5468998601d85058293390039cb"} Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.059559 4851 scope.go:117] "RemoveContainer" containerID="31fcf8212060db1c25000d2426518f27760b791f3b2b30ab82bb76ec52d29229" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.059564 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-k7rvb" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.062667 4851 generic.go:334] "Generic (PLEG): container finished" podID="c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b" containerID="02ea592b1b4b4189c8ed849e2d29c1a8f0e0d9c2341a6bcc56e43c45e0937d41" exitCode=0 Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.062741 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.062798 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" event={"ID":"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b","Type":"ContainerDied","Data":"02ea592b1b4b4189c8ed849e2d29c1a8f0e0d9c2341a6bcc56e43c45e0937d41"} Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.062861 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd" event={"ID":"c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b","Type":"ContainerDied","Data":"c50c62c2c54ce680d974806d475bcfca2774da14bc2449cc93f21e51cede59c2"} Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.092799 4851 scope.go:117] "RemoveContainer" containerID="02ea592b1b4b4189c8ed849e2d29c1a8f0e0d9c2341a6bcc56e43c45e0937d41" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.103605 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd"] Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.115442 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgkzd"] Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.125466 4851 scope.go:117] "RemoveContainer" containerID="02ea592b1b4b4189c8ed849e2d29c1a8f0e0d9c2341a6bcc56e43c45e0937d41" Oct 01 13:05:38 crc kubenswrapper[4851]: E1001 13:05:38.126117 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02ea592b1b4b4189c8ed849e2d29c1a8f0e0d9c2341a6bcc56e43c45e0937d41\": container with ID starting with 02ea592b1b4b4189c8ed849e2d29c1a8f0e0d9c2341a6bcc56e43c45e0937d41 not found: ID does not exist" containerID="02ea592b1b4b4189c8ed849e2d29c1a8f0e0d9c2341a6bcc56e43c45e0937d41" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.126183 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02ea592b1b4b4189c8ed849e2d29c1a8f0e0d9c2341a6bcc56e43c45e0937d41"} err="failed to get container status \"02ea592b1b4b4189c8ed849e2d29c1a8f0e0d9c2341a6bcc56e43c45e0937d41\": rpc error: code = NotFound desc = could not find container \"02ea592b1b4b4189c8ed849e2d29c1a8f0e0d9c2341a6bcc56e43c45e0937d41\": container with ID starting with 02ea592b1b4b4189c8ed849e2d29c1a8f0e0d9c2341a6bcc56e43c45e0937d41 not found: ID does not exist" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.128464 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k7rvb"] Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.131472 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k7rvb"] Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.170489 4851 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6977d5cdb6-2llwk"] Oct 01 13:05:38 crc kubenswrapper[4851]: E1001 13:05:38.170790 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6538b69-30fc-4cc0-80da-62537b61f41f" containerName="controller-manager" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.170808 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6538b69-30fc-4cc0-80da-62537b61f41f" containerName="controller-manager" Oct 01 13:05:38 crc kubenswrapper[4851]: E1001 13:05:38.170837 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b" containerName="route-controller-manager" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.170846 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b" containerName="route-controller-manager" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.170990 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b" containerName="route-controller-manager" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.171011 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6538b69-30fc-4cc0-80da-62537b61f41f" containerName="controller-manager" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.171537 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.175298 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.175920 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.176274 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.176637 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.177584 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.178879 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.180895 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69"] Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.181586 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.185896 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.187360 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.188011 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.188111 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.188247 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.188259 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.188463 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.189795 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-z4ptw" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.194095 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6977d5cdb6-2llwk"] Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.203239 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69"] Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.335417 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/402f402e-0dc9-4501-b418-339e67f83617-serving-cert\") pod \"route-controller-manager-5dd4768cd-7zs69\" (UID: \"402f402e-0dc9-4501-b418-339e67f83617\") " pod="openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.336630 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/402f402e-0dc9-4501-b418-339e67f83617-config\") pod \"route-controller-manager-5dd4768cd-7zs69\" (UID: \"402f402e-0dc9-4501-b418-339e67f83617\") " pod="openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.336656 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bdbf\" (UniqueName: \"kubernetes.io/projected/402f402e-0dc9-4501-b418-339e67f83617-kube-api-access-9bdbf\") pod \"route-controller-manager-5dd4768cd-7zs69\" (UID: \"402f402e-0dc9-4501-b418-339e67f83617\") " pod="openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.336458 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b" 
path="/var/lib/kubelet/pods/c72e3a6a-7fd1-4b47-8ade-a6b75c693b8b/volumes" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.336792 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ac20128-18c7-438a-a52d-5c1a8574d501-proxy-ca-bundles\") pod \"controller-manager-6977d5cdb6-2llwk\" (UID: \"3ac20128-18c7-438a-a52d-5c1a8574d501\") " pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.336822 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/402f402e-0dc9-4501-b418-339e67f83617-client-ca\") pod \"route-controller-manager-5dd4768cd-7zs69\" (UID: \"402f402e-0dc9-4501-b418-339e67f83617\") " pod="openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.336858 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ac20128-18c7-438a-a52d-5c1a8574d501-config\") pod \"controller-manager-6977d5cdb6-2llwk\" (UID: \"3ac20128-18c7-438a-a52d-5c1a8574d501\") " pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.336890 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ac20128-18c7-438a-a52d-5c1a8574d501-client-ca\") pod \"controller-manager-6977d5cdb6-2llwk\" (UID: \"3ac20128-18c7-438a-a52d-5c1a8574d501\") " pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.336956 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktn8m\" (UniqueName: \"kubernetes.io/projected/3ac20128-18c7-438a-a52d-5c1a8574d501-kube-api-access-ktn8m\") pod \"controller-manager-6977d5cdb6-2llwk\" (UID: \"3ac20128-18c7-438a-a52d-5c1a8574d501\") " pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.337005 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ac20128-18c7-438a-a52d-5c1a8574d501-serving-cert\") pod \"controller-manager-6977d5cdb6-2llwk\" (UID: \"3ac20128-18c7-438a-a52d-5c1a8574d501\") " pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.337188 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6538b69-30fc-4cc0-80da-62537b61f41f" path="/var/lib/kubelet/pods/f6538b69-30fc-4cc0-80da-62537b61f41f/volumes" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.424689 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.424753 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.430611 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:38 crc 
kubenswrapper[4851]: I1001 13:05:38.438975 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ac20128-18c7-438a-a52d-5c1a8574d501-config\") pod \"controller-manager-6977d5cdb6-2llwk\" (UID: \"3ac20128-18c7-438a-a52d-5c1a8574d501\") " pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.439066 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ac20128-18c7-438a-a52d-5c1a8574d501-client-ca\") pod \"controller-manager-6977d5cdb6-2llwk\" (UID: \"3ac20128-18c7-438a-a52d-5c1a8574d501\") " pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.439141 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktn8m\" (UniqueName: \"kubernetes.io/projected/3ac20128-18c7-438a-a52d-5c1a8574d501-kube-api-access-ktn8m\") pod \"controller-manager-6977d5cdb6-2llwk\" (UID: \"3ac20128-18c7-438a-a52d-5c1a8574d501\") " pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.439263 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ac20128-18c7-438a-a52d-5c1a8574d501-serving-cert\") pod \"controller-manager-6977d5cdb6-2llwk\" (UID: \"3ac20128-18c7-438a-a52d-5c1a8574d501\") " pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.439401 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/402f402e-0dc9-4501-b418-339e67f83617-serving-cert\") pod \"route-controller-manager-5dd4768cd-7zs69\" (UID: \"402f402e-0dc9-4501-b418-339e67f83617\") " pod="openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.439457 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/402f402e-0dc9-4501-b418-339e67f83617-config\") pod \"route-controller-manager-5dd4768cd-7zs69\" (UID: \"402f402e-0dc9-4501-b418-339e67f83617\") " pod="openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.441445 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ac20128-18c7-438a-a52d-5c1a8574d501-client-ca\") pod \"controller-manager-6977d5cdb6-2llwk\" (UID: \"3ac20128-18c7-438a-a52d-5c1a8574d501\") " pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.441574 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ac20128-18c7-438a-a52d-5c1a8574d501-config\") pod \"controller-manager-6977d5cdb6-2llwk\" (UID: \"3ac20128-18c7-438a-a52d-5c1a8574d501\") " pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.442273 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/402f402e-0dc9-4501-b418-339e67f83617-config\") pod 
\"route-controller-manager-5dd4768cd-7zs69\" (UID: \"402f402e-0dc9-4501-b418-339e67f83617\") " pod="openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.442386 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bdbf\" (UniqueName: \"kubernetes.io/projected/402f402e-0dc9-4501-b418-339e67f83617-kube-api-access-9bdbf\") pod \"route-controller-manager-5dd4768cd-7zs69\" (UID: \"402f402e-0dc9-4501-b418-339e67f83617\") " pod="openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.442547 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ac20128-18c7-438a-a52d-5c1a8574d501-proxy-ca-bundles\") pod \"controller-manager-6977d5cdb6-2llwk\" (UID: \"3ac20128-18c7-438a-a52d-5c1a8574d501\") " pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.442605 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/402f402e-0dc9-4501-b418-339e67f83617-client-ca\") pod \"route-controller-manager-5dd4768cd-7zs69\" (UID: \"402f402e-0dc9-4501-b418-339e67f83617\") " pod="openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.443754 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/402f402e-0dc9-4501-b418-339e67f83617-client-ca\") pod \"route-controller-manager-5dd4768cd-7zs69\" (UID: \"402f402e-0dc9-4501-b418-339e67f83617\") " pod="openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.445291 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ac20128-18c7-438a-a52d-5c1a8574d501-serving-cert\") pod \"controller-manager-6977d5cdb6-2llwk\" (UID: \"3ac20128-18c7-438a-a52d-5c1a8574d501\") " pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.445557 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/402f402e-0dc9-4501-b418-339e67f83617-serving-cert\") pod \"route-controller-manager-5dd4768cd-7zs69\" (UID: \"402f402e-0dc9-4501-b418-339e67f83617\") " pod="openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.445663 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ac20128-18c7-438a-a52d-5c1a8574d501-proxy-ca-bundles\") pod \"controller-manager-6977d5cdb6-2llwk\" (UID: \"3ac20128-18c7-438a-a52d-5c1a8574d501\") " pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.460607 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bdbf\" (UniqueName: \"kubernetes.io/projected/402f402e-0dc9-4501-b418-339e67f83617-kube-api-access-9bdbf\") pod \"route-controller-manager-5dd4768cd-7zs69\" (UID: \"402f402e-0dc9-4501-b418-339e67f83617\") " 
pod="openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.465937 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktn8m\" (UniqueName: \"kubernetes.io/projected/3ac20128-18c7-438a-a52d-5c1a8574d501-kube-api-access-ktn8m\") pod \"controller-manager-6977d5cdb6-2llwk\" (UID: \"3ac20128-18c7-438a-a52d-5c1a8574d501\") " pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.498086 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.504573 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69" Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.741995 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69"] Oct 01 13:05:38 crc kubenswrapper[4851]: W1001 13:05:38.749689 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod402f402e_0dc9_4501_b418_339e67f83617.slice/crio-61a36d12cf238d0df671a3364c3b36666d2404494fcb6922ace8e27da271e7ec WatchSource:0}: Error finding container 61a36d12cf238d0df671a3364c3b36666d2404494fcb6922ace8e27da271e7ec: Status 404 returned error can't find the container with id 61a36d12cf238d0df671a3364c3b36666d2404494fcb6922ace8e27da271e7ec Oct 01 13:05:38 crc kubenswrapper[4851]: I1001 13:05:38.788193 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6977d5cdb6-2llwk"] Oct 01 13:05:39 crc kubenswrapper[4851]: I1001 13:05:39.074177 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69" event={"ID":"402f402e-0dc9-4501-b418-339e67f83617","Type":"ContainerStarted","Data":"2b468586d824178dfacd8ae57e9e4c1265dbaaa324e25fe07ec199ddc3d1cae6"} Oct 01 13:05:39 crc kubenswrapper[4851]: I1001 13:05:39.076630 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69" event={"ID":"402f402e-0dc9-4501-b418-339e67f83617","Type":"ContainerStarted","Data":"61a36d12cf238d0df671a3364c3b36666d2404494fcb6922ace8e27da271e7ec"} Oct 01 13:05:39 crc kubenswrapper[4851]: I1001 13:05:39.076681 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69" Oct 01 13:05:39 crc kubenswrapper[4851]: I1001 13:05:39.077124 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" event={"ID":"3ac20128-18c7-438a-a52d-5c1a8574d501","Type":"ContainerStarted","Data":"1db08dd0244f03c2c8d77871227bf601ff8e278f4ca1732431baefe7bbfebc17"} Oct 01 13:05:39 crc kubenswrapper[4851]: I1001 13:05:39.077147 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" event={"ID":"3ac20128-18c7-438a-a52d-5c1a8574d501","Type":"ContainerStarted","Data":"682cbbb01629d076d617684a41d374d067952962d9478be67ffc9cb4ef268d00"} Oct 01 13:05:39 crc kubenswrapper[4851]: I1001 13:05:39.077382 4851 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" Oct 01 13:05:39 crc kubenswrapper[4851]: I1001 13:05:39.079383 4851 patch_prober.go:28] interesting pod/route-controller-manager-5dd4768cd-7zs69 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body= Oct 01 13:05:39 crc kubenswrapper[4851]: I1001 13:05:39.079423 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69" podUID="402f402e-0dc9-4501-b418-339e67f83617" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Oct 01 13:05:39 crc kubenswrapper[4851]: I1001 13:05:39.080342 4851 patch_prober.go:28] interesting pod/controller-manager-6977d5cdb6-2llwk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Oct 01 13:05:39 crc kubenswrapper[4851]: I1001 13:05:39.080394 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" podUID="3ac20128-18c7-438a-a52d-5c1a8574d501" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Oct 01 13:05:39 crc kubenswrapper[4851]: I1001 13:05:39.082439 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68f4ff5c48-fgjpf" Oct 01 13:05:39 crc kubenswrapper[4851]: I1001 13:05:39.104035 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69" podStartSLOduration=2.104022065 podStartE2EDuration="2.104022065s" podCreationTimestamp="2025-10-01 13:05:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:05:39.102199302 +0000 UTC m=+747.447316788" watchObservedRunningTime="2025-10-01 13:05:39.104022065 +0000 UTC m=+747.449139551" Oct 01 13:05:39 crc kubenswrapper[4851]: I1001 13:05:39.125663 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" podStartSLOduration=2.125643641 podStartE2EDuration="2.125643641s" podCreationTimestamp="2025-10-01 13:05:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:05:39.123763347 +0000 UTC m=+747.468880853" watchObservedRunningTime="2025-10-01 13:05:39.125643641 +0000 UTC m=+747.470761127" Oct 01 13:05:39 crc kubenswrapper[4851]: I1001 13:05:39.182862 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-xjvqh"] Oct 01 13:05:40 crc kubenswrapper[4851]: I1001 13:05:40.094978 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5dd4768cd-7zs69" Oct 01 13:05:40 crc kubenswrapper[4851]: I1001 13:05:40.099181 4851 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6977d5cdb6-2llwk" Oct 01 13:05:46 crc kubenswrapper[4851]: I1001 13:05:46.304160 4851 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 01 13:05:48 crc kubenswrapper[4851]: I1001 13:05:48.144041 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-zpvld" Oct 01 13:06:00 crc kubenswrapper[4851]: I1001 13:06:00.050352 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:06:00 crc kubenswrapper[4851]: I1001 13:06:00.050997 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:06:00 crc kubenswrapper[4851]: I1001 13:06:00.051043 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 13:06:00 crc kubenswrapper[4851]: I1001 13:06:00.051585 4851 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"288531af0e115f9595ac6f6f759c2572ba4e5c19461b4094fb567dd41bccf2dd"} pod="openshift-machine-config-operator/machine-config-daemon-fv72m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:06:00 crc kubenswrapper[4851]: I1001 13:06:00.051642 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" containerID="cri-o://288531af0e115f9595ac6f6f759c2572ba4e5c19461b4094fb567dd41bccf2dd" gracePeriod=600 Oct 01 13:06:00 crc kubenswrapper[4851]: I1001 13:06:00.276753 4851 generic.go:334] "Generic (PLEG): container finished" podID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerID="288531af0e115f9595ac6f6f759c2572ba4e5c19461b4094fb567dd41bccf2dd" exitCode=0 Oct 01 13:06:00 crc kubenswrapper[4851]: I1001 13:06:00.276827 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerDied","Data":"288531af0e115f9595ac6f6f759c2572ba4e5c19461b4094fb567dd41bccf2dd"} Oct 01 13:06:00 crc kubenswrapper[4851]: I1001 13:06:00.276881 4851 scope.go:117] "RemoveContainer" containerID="df5480c0ce0978a756a190c53fc2c8d0701f9def62fec11373bfd7467e9dc90a" Oct 01 13:06:01 crc kubenswrapper[4851]: I1001 13:06:01.290230 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"be5d6b868e9238c5d4395c014452c5cfe7dc87bf6a9741e8af0bded2d6b25de6"} Oct 01 13:06:04 crc kubenswrapper[4851]: I1001 13:06:04.236879 4851 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-console/console-f9d7485db-xjvqh" podUID="b5b5efe6-729a-431f-b8cd-67562ec18593" containerName="console" containerID="cri-o://4f661b194509e41fcefde57718e0f5f850d35928f579e9235d1e3621cab8ae53" gracePeriod=15 Oct 01 13:06:04 crc kubenswrapper[4851]: I1001 13:06:04.823559 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-xjvqh_b5b5efe6-729a-431f-b8cd-67562ec18593/console/0.log" Oct 01 13:06:04 crc kubenswrapper[4851]: I1001 13:06:04.824162 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 13:06:04 crc kubenswrapper[4851]: I1001 13:06:04.975339 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-service-ca\") pod \"b5b5efe6-729a-431f-b8cd-67562ec18593\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " Oct 01 13:06:04 crc kubenswrapper[4851]: I1001 13:06:04.975447 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-console-config\") pod \"b5b5efe6-729a-431f-b8cd-67562ec18593\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " Oct 01 13:06:04 crc kubenswrapper[4851]: I1001 13:06:04.975567 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-oauth-serving-cert\") pod \"b5b5efe6-729a-431f-b8cd-67562ec18593\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " Oct 01 13:06:04 crc kubenswrapper[4851]: I1001 13:06:04.975607 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkkqh\" (UniqueName: \"kubernetes.io/projected/b5b5efe6-729a-431f-b8cd-67562ec18593-kube-api-access-xkkqh\") pod \"b5b5efe6-729a-431f-b8cd-67562ec18593\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " Oct 01 13:06:04 crc kubenswrapper[4851]: I1001 13:06:04.975630 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-trusted-ca-bundle\") pod \"b5b5efe6-729a-431f-b8cd-67562ec18593\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " Oct 01 13:06:04 crc kubenswrapper[4851]: I1001 13:06:04.975671 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5b5efe6-729a-431f-b8cd-67562ec18593-console-serving-cert\") pod \"b5b5efe6-729a-431f-b8cd-67562ec18593\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " Oct 01 13:06:04 crc kubenswrapper[4851]: I1001 13:06:04.975707 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5b5efe6-729a-431f-b8cd-67562ec18593-console-oauth-config\") pod \"b5b5efe6-729a-431f-b8cd-67562ec18593\" (UID: \"b5b5efe6-729a-431f-b8cd-67562ec18593\") " Oct 01 13:06:04 crc kubenswrapper[4851]: I1001 13:06:04.976741 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-service-ca" (OuterVolumeSpecName: "service-ca") pod "b5b5efe6-729a-431f-b8cd-67562ec18593" (UID: "b5b5efe6-729a-431f-b8cd-67562ec18593"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4851]: I1001 13:06:04.976986 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-console-config" (OuterVolumeSpecName: "console-config") pod "b5b5efe6-729a-431f-b8cd-67562ec18593" (UID: "b5b5efe6-729a-431f-b8cd-67562ec18593"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4851]: I1001 13:06:04.977310 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b5b5efe6-729a-431f-b8cd-67562ec18593" (UID: "b5b5efe6-729a-431f-b8cd-67562ec18593"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4851]: I1001 13:06:04.977702 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b5b5efe6-729a-431f-b8cd-67562ec18593" (UID: "b5b5efe6-729a-431f-b8cd-67562ec18593"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4851]: I1001 13:06:04.985189 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5b5efe6-729a-431f-b8cd-67562ec18593-kube-api-access-xkkqh" (OuterVolumeSpecName: "kube-api-access-xkkqh") pod "b5b5efe6-729a-431f-b8cd-67562ec18593" (UID: "b5b5efe6-729a-431f-b8cd-67562ec18593"). InnerVolumeSpecName "kube-api-access-xkkqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4851]: I1001 13:06:04.985227 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b5efe6-729a-431f-b8cd-67562ec18593-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b5b5efe6-729a-431f-b8cd-67562ec18593" (UID: "b5b5efe6-729a-431f-b8cd-67562ec18593"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:05 crc kubenswrapper[4851]: I1001 13:06:05.002805 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b5efe6-729a-431f-b8cd-67562ec18593-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b5b5efe6-729a-431f-b8cd-67562ec18593" (UID: "b5b5efe6-729a-431f-b8cd-67562ec18593"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:05 crc kubenswrapper[4851]: I1001 13:06:05.077601 4851 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:05 crc kubenswrapper[4851]: I1001 13:06:05.077635 4851 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-console-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:05 crc kubenswrapper[4851]: I1001 13:06:05.077650 4851 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:05 crc kubenswrapper[4851]: I1001 13:06:05.077660 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkkqh\" (UniqueName: \"kubernetes.io/projected/b5b5efe6-729a-431f-b8cd-67562ec18593-kube-api-access-xkkqh\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:05 crc kubenswrapper[4851]: I1001 13:06:05.077673 4851 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5b5efe6-729a-431f-b8cd-67562ec18593-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:05 crc kubenswrapper[4851]: I1001 13:06:05.077682 4851 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5b5efe6-729a-431f-b8cd-67562ec18593-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:05 crc kubenswrapper[4851]: I1001 13:06:05.077690 4851 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5b5efe6-729a-431f-b8cd-67562ec18593-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:05 crc kubenswrapper[4851]: I1001 13:06:05.328740 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-xjvqh_b5b5efe6-729a-431f-b8cd-67562ec18593/console/0.log" Oct 01 13:06:05 crc kubenswrapper[4851]: I1001 13:06:05.328805 4851 generic.go:334] "Generic (PLEG): container finished" podID="b5b5efe6-729a-431f-b8cd-67562ec18593" containerID="4f661b194509e41fcefde57718e0f5f850d35928f579e9235d1e3621cab8ae53" exitCode=2 Oct 01 13:06:05 crc kubenswrapper[4851]: I1001 13:06:05.328844 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xjvqh" event={"ID":"b5b5efe6-729a-431f-b8cd-67562ec18593","Type":"ContainerDied","Data":"4f661b194509e41fcefde57718e0f5f850d35928f579e9235d1e3621cab8ae53"} Oct 01 13:06:05 crc kubenswrapper[4851]: I1001 13:06:05.328868 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xjvqh" event={"ID":"b5b5efe6-729a-431f-b8cd-67562ec18593","Type":"ContainerDied","Data":"ea1ed7ee94da6c163f181cab7768b49a7f8eb8726aaeada4beba7cc0563adeae"} Oct 01 13:06:05 crc kubenswrapper[4851]: I1001 13:06:05.328891 4851 scope.go:117] "RemoveContainer" containerID="4f661b194509e41fcefde57718e0f5f850d35928f579e9235d1e3621cab8ae53" Oct 01 13:06:05 crc kubenswrapper[4851]: I1001 13:06:05.328957 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-xjvqh" Oct 01 13:06:05 crc kubenswrapper[4851]: I1001 13:06:05.355941 4851 scope.go:117] "RemoveContainer" containerID="4f661b194509e41fcefde57718e0f5f850d35928f579e9235d1e3621cab8ae53" Oct 01 13:06:05 crc kubenswrapper[4851]: E1001 13:06:05.357730 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f661b194509e41fcefde57718e0f5f850d35928f579e9235d1e3621cab8ae53\": container with ID starting with 4f661b194509e41fcefde57718e0f5f850d35928f579e9235d1e3621cab8ae53 not found: ID does not exist" containerID="4f661b194509e41fcefde57718e0f5f850d35928f579e9235d1e3621cab8ae53" Oct 01 13:06:05 crc kubenswrapper[4851]: I1001 13:06:05.357798 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f661b194509e41fcefde57718e0f5f850d35928f579e9235d1e3621cab8ae53"} err="failed to get container status \"4f661b194509e41fcefde57718e0f5f850d35928f579e9235d1e3621cab8ae53\": rpc error: code = NotFound desc = could not find container \"4f661b194509e41fcefde57718e0f5f850d35928f579e9235d1e3621cab8ae53\": container with ID starting with 4f661b194509e41fcefde57718e0f5f850d35928f579e9235d1e3621cab8ae53 not found: ID does not exist" Oct 01 13:06:05 crc kubenswrapper[4851]: I1001 13:06:05.384285 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-xjvqh"] Oct 01 13:06:05 crc kubenswrapper[4851]: I1001 13:06:05.388408 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-xjvqh"] Oct 01 13:06:06 crc kubenswrapper[4851]: I1001 13:06:06.254304 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-477gz"] Oct 01 13:06:06 crc kubenswrapper[4851]: E1001 13:06:06.254742 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5b5efe6-729a-431f-b8cd-67562ec18593" containerName="console" Oct 01 13:06:06 crc kubenswrapper[4851]: I1001 13:06:06.254769 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b5efe6-729a-431f-b8cd-67562ec18593" containerName="console" Oct 01 13:06:06 crc kubenswrapper[4851]: I1001 13:06:06.254915 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5b5efe6-729a-431f-b8cd-67562ec18593" containerName="console" Oct 01 13:06:06 crc kubenswrapper[4851]: I1001 13:06:06.256107 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-477gz" Oct 01 13:06:06 crc kubenswrapper[4851]: I1001 13:06:06.271813 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-477gz"] Oct 01 13:06:06 crc kubenswrapper[4851]: I1001 13:06:06.339102 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5b5efe6-729a-431f-b8cd-67562ec18593" path="/var/lib/kubelet/pods/b5b5efe6-729a-431f-b8cd-67562ec18593/volumes" Oct 01 13:06:06 crc kubenswrapper[4851]: I1001 13:06:06.396942 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdcjv\" (UniqueName: \"kubernetes.io/projected/506e30be-ed2d-446b-8ea1-cc3c0916638b-kube-api-access-mdcjv\") pod \"redhat-marketplace-477gz\" (UID: \"506e30be-ed2d-446b-8ea1-cc3c0916638b\") " pod="openshift-marketplace/redhat-marketplace-477gz" Oct 01 13:06:06 crc kubenswrapper[4851]: I1001 13:06:06.397278 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/506e30be-ed2d-446b-8ea1-cc3c0916638b-utilities\") pod \"redhat-marketplace-477gz\" (UID: \"506e30be-ed2d-446b-8ea1-cc3c0916638b\") " pod="openshift-marketplace/redhat-marketplace-477gz" Oct 01 13:06:06 crc kubenswrapper[4851]: I1001 13:06:06.397426 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/506e30be-ed2d-446b-8ea1-cc3c0916638b-catalog-content\") pod \"redhat-marketplace-477gz\" (UID: \"506e30be-ed2d-446b-8ea1-cc3c0916638b\") " pod="openshift-marketplace/redhat-marketplace-477gz" Oct 01 13:06:06 crc kubenswrapper[4851]: I1001 13:06:06.499244 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/506e30be-ed2d-446b-8ea1-cc3c0916638b-utilities\") pod \"redhat-marketplace-477gz\" (UID: \"506e30be-ed2d-446b-8ea1-cc3c0916638b\") " pod="openshift-marketplace/redhat-marketplace-477gz" Oct 01 13:06:06 crc kubenswrapper[4851]: I1001 13:06:06.499328 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/506e30be-ed2d-446b-8ea1-cc3c0916638b-catalog-content\") pod \"redhat-marketplace-477gz\" (UID: \"506e30be-ed2d-446b-8ea1-cc3c0916638b\") " pod="openshift-marketplace/redhat-marketplace-477gz" Oct 01 13:06:06 crc kubenswrapper[4851]: I1001 13:06:06.499550 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdcjv\" (UniqueName: \"kubernetes.io/projected/506e30be-ed2d-446b-8ea1-cc3c0916638b-kube-api-access-mdcjv\") pod \"redhat-marketplace-477gz\" (UID: \"506e30be-ed2d-446b-8ea1-cc3c0916638b\") " pod="openshift-marketplace/redhat-marketplace-477gz" Oct 01 13:06:06 crc kubenswrapper[4851]: I1001 13:06:06.500988 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/506e30be-ed2d-446b-8ea1-cc3c0916638b-catalog-content\") pod \"redhat-marketplace-477gz\" (UID: \"506e30be-ed2d-446b-8ea1-cc3c0916638b\") " pod="openshift-marketplace/redhat-marketplace-477gz" Oct 01 13:06:06 crc kubenswrapper[4851]: I1001 13:06:06.501279 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/506e30be-ed2d-446b-8ea1-cc3c0916638b-utilities\") 
pod \"redhat-marketplace-477gz\" (UID: \"506e30be-ed2d-446b-8ea1-cc3c0916638b\") " pod="openshift-marketplace/redhat-marketplace-477gz" Oct 01 13:06:06 crc kubenswrapper[4851]: I1001 13:06:06.537091 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdcjv\" (UniqueName: \"kubernetes.io/projected/506e30be-ed2d-446b-8ea1-cc3c0916638b-kube-api-access-mdcjv\") pod \"redhat-marketplace-477gz\" (UID: \"506e30be-ed2d-446b-8ea1-cc3c0916638b\") " pod="openshift-marketplace/redhat-marketplace-477gz" Oct 01 13:06:06 crc kubenswrapper[4851]: I1001 13:06:06.596204 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-477gz" Oct 01 13:06:06 crc kubenswrapper[4851]: I1001 13:06:06.887493 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf"] Oct 01 13:06:06 crc kubenswrapper[4851]: I1001 13:06:06.889356 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf" Oct 01 13:06:06 crc kubenswrapper[4851]: I1001 13:06:06.891136 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 01 13:06:06 crc kubenswrapper[4851]: I1001 13:06:06.898531 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf"] Oct 01 13:06:07 crc kubenswrapper[4851]: I1001 13:06:07.006240 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ddb9987b-6077-405c-bcc2-15d713fb434d-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf\" (UID: \"ddb9987b-6077-405c-bcc2-15d713fb434d\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf" Oct 01 13:06:07 crc kubenswrapper[4851]: I1001 13:06:07.006690 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k4hl\" (UniqueName: \"kubernetes.io/projected/ddb9987b-6077-405c-bcc2-15d713fb434d-kube-api-access-7k4hl\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf\" (UID: \"ddb9987b-6077-405c-bcc2-15d713fb434d\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf" Oct 01 13:06:07 crc kubenswrapper[4851]: I1001 13:06:07.006845 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ddb9987b-6077-405c-bcc2-15d713fb434d-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf\" (UID: \"ddb9987b-6077-405c-bcc2-15d713fb434d\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf" Oct 01 13:06:07 crc kubenswrapper[4851]: I1001 13:06:07.069822 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-477gz"] Oct 01 13:06:07 crc kubenswrapper[4851]: W1001 13:06:07.086550 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod506e30be_ed2d_446b_8ea1_cc3c0916638b.slice/crio-4b8bf7518617b4823e20ada91c8ce5608babc3259eedc9aec3cd2ae0c9b16393 WatchSource:0}: Error finding container 
4b8bf7518617b4823e20ada91c8ce5608babc3259eedc9aec3cd2ae0c9b16393: Status 404 returned error can't find the container with id 4b8bf7518617b4823e20ada91c8ce5608babc3259eedc9aec3cd2ae0c9b16393 Oct 01 13:06:07 crc kubenswrapper[4851]: I1001 13:06:07.108695 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ddb9987b-6077-405c-bcc2-15d713fb434d-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf\" (UID: \"ddb9987b-6077-405c-bcc2-15d713fb434d\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf" Oct 01 13:06:07 crc kubenswrapper[4851]: I1001 13:06:07.109415 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ddb9987b-6077-405c-bcc2-15d713fb434d-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf\" (UID: \"ddb9987b-6077-405c-bcc2-15d713fb434d\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf" Oct 01 13:06:07 crc kubenswrapper[4851]: I1001 13:06:07.114733 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ddb9987b-6077-405c-bcc2-15d713fb434d-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf\" (UID: \"ddb9987b-6077-405c-bcc2-15d713fb434d\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf" Oct 01 13:06:07 crc kubenswrapper[4851]: I1001 13:06:07.114785 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k4hl\" (UniqueName: \"kubernetes.io/projected/ddb9987b-6077-405c-bcc2-15d713fb434d-kube-api-access-7k4hl\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf\" (UID: \"ddb9987b-6077-405c-bcc2-15d713fb434d\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf" Oct 01 13:06:07 crc kubenswrapper[4851]: I1001 13:06:07.115101 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ddb9987b-6077-405c-bcc2-15d713fb434d-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf\" (UID: \"ddb9987b-6077-405c-bcc2-15d713fb434d\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf" Oct 01 13:06:07 crc kubenswrapper[4851]: I1001 13:06:07.140194 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k4hl\" (UniqueName: \"kubernetes.io/projected/ddb9987b-6077-405c-bcc2-15d713fb434d-kube-api-access-7k4hl\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf\" (UID: \"ddb9987b-6077-405c-bcc2-15d713fb434d\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf" Oct 01 13:06:07 crc kubenswrapper[4851]: I1001 13:06:07.206378 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf" Oct 01 13:06:07 crc kubenswrapper[4851]: I1001 13:06:07.352998 4851 generic.go:334] "Generic (PLEG): container finished" podID="506e30be-ed2d-446b-8ea1-cc3c0916638b" containerID="8cd25c12efc8a031eaa0ff419ac471a3e4b103dd7431cce39e6afcd970e71b87" exitCode=0 Oct 01 13:06:07 crc kubenswrapper[4851]: I1001 13:06:07.354004 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-477gz" event={"ID":"506e30be-ed2d-446b-8ea1-cc3c0916638b","Type":"ContainerDied","Data":"8cd25c12efc8a031eaa0ff419ac471a3e4b103dd7431cce39e6afcd970e71b87"} Oct 01 13:06:07 crc kubenswrapper[4851]: I1001 13:06:07.354053 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-477gz" event={"ID":"506e30be-ed2d-446b-8ea1-cc3c0916638b","Type":"ContainerStarted","Data":"4b8bf7518617b4823e20ada91c8ce5608babc3259eedc9aec3cd2ae0c9b16393"} Oct 01 13:06:07 crc kubenswrapper[4851]: I1001 13:06:07.636058 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf"] Oct 01 13:06:07 crc kubenswrapper[4851]: W1001 13:06:07.643043 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddb9987b_6077_405c_bcc2_15d713fb434d.slice/crio-7a06a72bbed840e07f65981158ef02373de832c4170228534804da53c95e635f WatchSource:0}: Error finding container 7a06a72bbed840e07f65981158ef02373de832c4170228534804da53c95e635f: Status 404 returned error can't find the container with id 7a06a72bbed840e07f65981158ef02373de832c4170228534804da53c95e635f Oct 01 13:06:08 crc kubenswrapper[4851]: I1001 13:06:08.364005 4851 generic.go:334] "Generic (PLEG): container finished" podID="ddb9987b-6077-405c-bcc2-15d713fb434d" containerID="a79d67ca38c02e684198840b42baac0199265451c08231439a08db3f390dcbf1" exitCode=0 Oct 01 13:06:08 crc kubenswrapper[4851]: I1001 13:06:08.364099 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf" event={"ID":"ddb9987b-6077-405c-bcc2-15d713fb434d","Type":"ContainerDied","Data":"a79d67ca38c02e684198840b42baac0199265451c08231439a08db3f390dcbf1"} Oct 01 13:06:08 crc kubenswrapper[4851]: I1001 13:06:08.364412 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf" event={"ID":"ddb9987b-6077-405c-bcc2-15d713fb434d","Type":"ContainerStarted","Data":"7a06a72bbed840e07f65981158ef02373de832c4170228534804da53c95e635f"} Oct 01 13:06:09 crc kubenswrapper[4851]: I1001 13:06:09.373549 4851 generic.go:334] "Generic (PLEG): container finished" podID="506e30be-ed2d-446b-8ea1-cc3c0916638b" containerID="9e3837c8ea53c0681faea6080098c7b3191e75c755342b21913bb2e170d33782" exitCode=0 Oct 01 13:06:09 crc kubenswrapper[4851]: I1001 13:06:09.373607 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-477gz" event={"ID":"506e30be-ed2d-446b-8ea1-cc3c0916638b","Type":"ContainerDied","Data":"9e3837c8ea53c0681faea6080098c7b3191e75c755342b21913bb2e170d33782"} Oct 01 13:06:10 crc kubenswrapper[4851]: I1001 13:06:10.384474 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-477gz" 
event={"ID":"506e30be-ed2d-446b-8ea1-cc3c0916638b","Type":"ContainerStarted","Data":"6a47bb43692cd57273824d86c6218eb84765aa8796f412b207b701e4b5b70085"} Oct 01 13:06:11 crc kubenswrapper[4851]: I1001 13:06:11.391876 4851 generic.go:334] "Generic (PLEG): container finished" podID="ddb9987b-6077-405c-bcc2-15d713fb434d" containerID="e87efecbc5ca3bde29844639fa574df8ed8fb5ec76525558679f0f1f315b6566" exitCode=0 Oct 01 13:06:11 crc kubenswrapper[4851]: I1001 13:06:11.391920 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf" event={"ID":"ddb9987b-6077-405c-bcc2-15d713fb434d","Type":"ContainerDied","Data":"e87efecbc5ca3bde29844639fa574df8ed8fb5ec76525558679f0f1f315b6566"} Oct 01 13:06:11 crc kubenswrapper[4851]: I1001 13:06:11.414007 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-477gz" podStartSLOduration=2.942440834 podStartE2EDuration="5.413990922s" podCreationTimestamp="2025-10-01 13:06:06 +0000 UTC" firstStartedPulling="2025-10-01 13:06:07.355074498 +0000 UTC m=+775.700191994" lastFinishedPulling="2025-10-01 13:06:09.826624556 +0000 UTC m=+778.171742082" observedRunningTime="2025-10-01 13:06:10.409160326 +0000 UTC m=+778.754277852" watchObservedRunningTime="2025-10-01 13:06:11.413990922 +0000 UTC m=+779.759108408" Oct 01 13:06:11 crc kubenswrapper[4851]: I1001 13:06:11.641493 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bgnmp"] Oct 01 13:06:11 crc kubenswrapper[4851]: I1001 13:06:11.643387 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bgnmp" Oct 01 13:06:11 crc kubenswrapper[4851]: I1001 13:06:11.658873 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bgnmp"] Oct 01 13:06:11 crc kubenswrapper[4851]: I1001 13:06:11.787423 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2k66\" (UniqueName: \"kubernetes.io/projected/4c1a9063-23de-46a7-bd5e-8763dee075c4-kube-api-access-c2k66\") pod \"redhat-operators-bgnmp\" (UID: \"4c1a9063-23de-46a7-bd5e-8763dee075c4\") " pod="openshift-marketplace/redhat-operators-bgnmp" Oct 01 13:06:11 crc kubenswrapper[4851]: I1001 13:06:11.787689 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c1a9063-23de-46a7-bd5e-8763dee075c4-utilities\") pod \"redhat-operators-bgnmp\" (UID: \"4c1a9063-23de-46a7-bd5e-8763dee075c4\") " pod="openshift-marketplace/redhat-operators-bgnmp" Oct 01 13:06:11 crc kubenswrapper[4851]: I1001 13:06:11.787731 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c1a9063-23de-46a7-bd5e-8763dee075c4-catalog-content\") pod \"redhat-operators-bgnmp\" (UID: \"4c1a9063-23de-46a7-bd5e-8763dee075c4\") " pod="openshift-marketplace/redhat-operators-bgnmp" Oct 01 13:06:11 crc kubenswrapper[4851]: I1001 13:06:11.889215 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2k66\" (UniqueName: \"kubernetes.io/projected/4c1a9063-23de-46a7-bd5e-8763dee075c4-kube-api-access-c2k66\") pod \"redhat-operators-bgnmp\" (UID: \"4c1a9063-23de-46a7-bd5e-8763dee075c4\") " 
pod="openshift-marketplace/redhat-operators-bgnmp" Oct 01 13:06:11 crc kubenswrapper[4851]: I1001 13:06:11.889291 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c1a9063-23de-46a7-bd5e-8763dee075c4-utilities\") pod \"redhat-operators-bgnmp\" (UID: \"4c1a9063-23de-46a7-bd5e-8763dee075c4\") " pod="openshift-marketplace/redhat-operators-bgnmp" Oct 01 13:06:11 crc kubenswrapper[4851]: I1001 13:06:11.889344 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c1a9063-23de-46a7-bd5e-8763dee075c4-catalog-content\") pod \"redhat-operators-bgnmp\" (UID: \"4c1a9063-23de-46a7-bd5e-8763dee075c4\") " pod="openshift-marketplace/redhat-operators-bgnmp" Oct 01 13:06:11 crc kubenswrapper[4851]: I1001 13:06:11.889905 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c1a9063-23de-46a7-bd5e-8763dee075c4-catalog-content\") pod \"redhat-operators-bgnmp\" (UID: \"4c1a9063-23de-46a7-bd5e-8763dee075c4\") " pod="openshift-marketplace/redhat-operators-bgnmp" Oct 01 13:06:11 crc kubenswrapper[4851]: I1001 13:06:11.889994 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c1a9063-23de-46a7-bd5e-8763dee075c4-utilities\") pod \"redhat-operators-bgnmp\" (UID: \"4c1a9063-23de-46a7-bd5e-8763dee075c4\") " pod="openshift-marketplace/redhat-operators-bgnmp" Oct 01 13:06:11 crc kubenswrapper[4851]: I1001 13:06:11.908155 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2k66\" (UniqueName: \"kubernetes.io/projected/4c1a9063-23de-46a7-bd5e-8763dee075c4-kube-api-access-c2k66\") pod \"redhat-operators-bgnmp\" (UID: \"4c1a9063-23de-46a7-bd5e-8763dee075c4\") " pod="openshift-marketplace/redhat-operators-bgnmp" Oct 01 13:06:11 crc kubenswrapper[4851]: I1001 13:06:11.987402 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bgnmp" Oct 01 13:06:12 crc kubenswrapper[4851]: I1001 13:06:12.398844 4851 generic.go:334] "Generic (PLEG): container finished" podID="ddb9987b-6077-405c-bcc2-15d713fb434d" containerID="db3dfe9010808733e92d2c310f8cf297afff1049e0fe9d57f3bd77ef1f218599" exitCode=0 Oct 01 13:06:12 crc kubenswrapper[4851]: I1001 13:06:12.398885 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf" event={"ID":"ddb9987b-6077-405c-bcc2-15d713fb434d","Type":"ContainerDied","Data":"db3dfe9010808733e92d2c310f8cf297afff1049e0fe9d57f3bd77ef1f218599"} Oct 01 13:06:12 crc kubenswrapper[4851]: I1001 13:06:12.461092 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bgnmp"] Oct 01 13:06:13 crc kubenswrapper[4851]: I1001 13:06:13.410702 4851 generic.go:334] "Generic (PLEG): container finished" podID="4c1a9063-23de-46a7-bd5e-8763dee075c4" containerID="f1e3a7166347ff4893fa30909d716f21f7ede06539914aa7369cc660186dd358" exitCode=0 Oct 01 13:06:13 crc kubenswrapper[4851]: I1001 13:06:13.410794 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgnmp" event={"ID":"4c1a9063-23de-46a7-bd5e-8763dee075c4","Type":"ContainerDied","Data":"f1e3a7166347ff4893fa30909d716f21f7ede06539914aa7369cc660186dd358"} Oct 01 13:06:13 crc kubenswrapper[4851]: I1001 13:06:13.410864 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgnmp" event={"ID":"4c1a9063-23de-46a7-bd5e-8763dee075c4","Type":"ContainerStarted","Data":"f798997327e7e637539a91b21493dfa4a4bbf069ae464b7db374d304264e1c9d"} Oct 01 13:06:13 crc kubenswrapper[4851]: I1001 13:06:13.832058 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf" Oct 01 13:06:13 crc kubenswrapper[4851]: I1001 13:06:13.925078 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ddb9987b-6077-405c-bcc2-15d713fb434d-bundle\") pod \"ddb9987b-6077-405c-bcc2-15d713fb434d\" (UID: \"ddb9987b-6077-405c-bcc2-15d713fb434d\") " Oct 01 13:06:13 crc kubenswrapper[4851]: I1001 13:06:13.925215 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k4hl\" (UniqueName: \"kubernetes.io/projected/ddb9987b-6077-405c-bcc2-15d713fb434d-kube-api-access-7k4hl\") pod \"ddb9987b-6077-405c-bcc2-15d713fb434d\" (UID: \"ddb9987b-6077-405c-bcc2-15d713fb434d\") " Oct 01 13:06:13 crc kubenswrapper[4851]: I1001 13:06:13.925340 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ddb9987b-6077-405c-bcc2-15d713fb434d-util\") pod \"ddb9987b-6077-405c-bcc2-15d713fb434d\" (UID: \"ddb9987b-6077-405c-bcc2-15d713fb434d\") " Oct 01 13:06:13 crc kubenswrapper[4851]: I1001 13:06:13.926619 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddb9987b-6077-405c-bcc2-15d713fb434d-bundle" (OuterVolumeSpecName: "bundle") pod "ddb9987b-6077-405c-bcc2-15d713fb434d" (UID: "ddb9987b-6077-405c-bcc2-15d713fb434d"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:06:13 crc kubenswrapper[4851]: I1001 13:06:13.933967 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddb9987b-6077-405c-bcc2-15d713fb434d-kube-api-access-7k4hl" (OuterVolumeSpecName: "kube-api-access-7k4hl") pod "ddb9987b-6077-405c-bcc2-15d713fb434d" (UID: "ddb9987b-6077-405c-bcc2-15d713fb434d"). InnerVolumeSpecName "kube-api-access-7k4hl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:13 crc kubenswrapper[4851]: I1001 13:06:13.948108 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddb9987b-6077-405c-bcc2-15d713fb434d-util" (OuterVolumeSpecName: "util") pod "ddb9987b-6077-405c-bcc2-15d713fb434d" (UID: "ddb9987b-6077-405c-bcc2-15d713fb434d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:06:14 crc kubenswrapper[4851]: I1001 13:06:14.026638 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k4hl\" (UniqueName: \"kubernetes.io/projected/ddb9987b-6077-405c-bcc2-15d713fb434d-kube-api-access-7k4hl\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:14 crc kubenswrapper[4851]: I1001 13:06:14.026688 4851 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ddb9987b-6077-405c-bcc2-15d713fb434d-util\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:14 crc kubenswrapper[4851]: I1001 13:06:14.026701 4851 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ddb9987b-6077-405c-bcc2-15d713fb434d-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:14 crc kubenswrapper[4851]: I1001 13:06:14.422179 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf" event={"ID":"ddb9987b-6077-405c-bcc2-15d713fb434d","Type":"ContainerDied","Data":"7a06a72bbed840e07f65981158ef02373de832c4170228534804da53c95e635f"} Oct 01 13:06:14 crc kubenswrapper[4851]: I1001 13:06:14.422246 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a06a72bbed840e07f65981158ef02373de832c4170228534804da53c95e635f" Oct 01 13:06:14 crc kubenswrapper[4851]: I1001 13:06:14.422370 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf" Oct 01 13:06:16 crc kubenswrapper[4851]: I1001 13:06:16.597152 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-477gz" Oct 01 13:06:16 crc kubenswrapper[4851]: I1001 13:06:16.597231 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-477gz" Oct 01 13:06:16 crc kubenswrapper[4851]: I1001 13:06:16.640585 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-477gz" Oct 01 13:06:17 crc kubenswrapper[4851]: I1001 13:06:17.494413 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-477gz" Oct 01 13:06:19 crc kubenswrapper[4851]: I1001 13:06:19.837394 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-477gz"] Oct 01 13:06:19 crc kubenswrapper[4851]: I1001 13:06:19.837949 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-477gz" podUID="506e30be-ed2d-446b-8ea1-cc3c0916638b" containerName="registry-server" containerID="cri-o://6a47bb43692cd57273824d86c6218eb84765aa8796f412b207b701e4b5b70085" gracePeriod=2 Oct 01 13:06:20 crc kubenswrapper[4851]: I1001 13:06:20.466945 4851 generic.go:334] "Generic (PLEG): container finished" podID="506e30be-ed2d-446b-8ea1-cc3c0916638b" containerID="6a47bb43692cd57273824d86c6218eb84765aa8796f412b207b701e4b5b70085" exitCode=0 Oct 01 13:06:20 crc kubenswrapper[4851]: I1001 13:06:20.467136 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-477gz" event={"ID":"506e30be-ed2d-446b-8ea1-cc3c0916638b","Type":"ContainerDied","Data":"6a47bb43692cd57273824d86c6218eb84765aa8796f412b207b701e4b5b70085"} Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.449920 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-bc7c7cbf4-vznd6"] Oct 01 13:06:24 crc kubenswrapper[4851]: E1001 13:06:24.450577 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddb9987b-6077-405c-bcc2-15d713fb434d" containerName="extract" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.450590 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddb9987b-6077-405c-bcc2-15d713fb434d" containerName="extract" Oct 01 13:06:24 crc kubenswrapper[4851]: E1001 13:06:24.450604 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddb9987b-6077-405c-bcc2-15d713fb434d" containerName="util" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.450610 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddb9987b-6077-405c-bcc2-15d713fb434d" containerName="util" Oct 01 13:06:24 crc kubenswrapper[4851]: E1001 13:06:24.450624 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddb9987b-6077-405c-bcc2-15d713fb434d" containerName="pull" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.450630 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddb9987b-6077-405c-bcc2-15d713fb434d" containerName="pull" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.450740 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddb9987b-6077-405c-bcc2-15d713fb434d" containerName="extract" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 
13:06:24.451083 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-bc7c7cbf4-vznd6" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.452704 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.453226 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.453251 4851 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.453414 4851 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-f62sq" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.453553 4851 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.460689 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-bc7c7cbf4-vznd6"] Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.531638 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f4a3620c-32ab-4a6b-9a3b-0ca2e11868e9-apiservice-cert\") pod \"metallb-operator-controller-manager-bc7c7cbf4-vznd6\" (UID: \"f4a3620c-32ab-4a6b-9a3b-0ca2e11868e9\") " pod="metallb-system/metallb-operator-controller-manager-bc7c7cbf4-vznd6" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.531850 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f4a3620c-32ab-4a6b-9a3b-0ca2e11868e9-webhook-cert\") pod \"metallb-operator-controller-manager-bc7c7cbf4-vznd6\" (UID: \"f4a3620c-32ab-4a6b-9a3b-0ca2e11868e9\") " pod="metallb-system/metallb-operator-controller-manager-bc7c7cbf4-vznd6" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.531919 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2ngl\" (UniqueName: \"kubernetes.io/projected/f4a3620c-32ab-4a6b-9a3b-0ca2e11868e9-kube-api-access-f2ngl\") pod \"metallb-operator-controller-manager-bc7c7cbf4-vznd6\" (UID: \"f4a3620c-32ab-4a6b-9a3b-0ca2e11868e9\") " pod="metallb-system/metallb-operator-controller-manager-bc7c7cbf4-vznd6" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.633025 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f4a3620c-32ab-4a6b-9a3b-0ca2e11868e9-apiservice-cert\") pod \"metallb-operator-controller-manager-bc7c7cbf4-vznd6\" (UID: \"f4a3620c-32ab-4a6b-9a3b-0ca2e11868e9\") " pod="metallb-system/metallb-operator-controller-manager-bc7c7cbf4-vznd6" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.633102 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f4a3620c-32ab-4a6b-9a3b-0ca2e11868e9-webhook-cert\") pod \"metallb-operator-controller-manager-bc7c7cbf4-vznd6\" (UID: \"f4a3620c-32ab-4a6b-9a3b-0ca2e11868e9\") " pod="metallb-system/metallb-operator-controller-manager-bc7c7cbf4-vznd6" Oct 01 13:06:24 crc 
kubenswrapper[4851]: I1001 13:06:24.633132 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2ngl\" (UniqueName: \"kubernetes.io/projected/f4a3620c-32ab-4a6b-9a3b-0ca2e11868e9-kube-api-access-f2ngl\") pod \"metallb-operator-controller-manager-bc7c7cbf4-vznd6\" (UID: \"f4a3620c-32ab-4a6b-9a3b-0ca2e11868e9\") " pod="metallb-system/metallb-operator-controller-manager-bc7c7cbf4-vznd6" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.653420 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f4a3620c-32ab-4a6b-9a3b-0ca2e11868e9-webhook-cert\") pod \"metallb-operator-controller-manager-bc7c7cbf4-vznd6\" (UID: \"f4a3620c-32ab-4a6b-9a3b-0ca2e11868e9\") " pod="metallb-system/metallb-operator-controller-manager-bc7c7cbf4-vznd6" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.653479 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f4a3620c-32ab-4a6b-9a3b-0ca2e11868e9-apiservice-cert\") pod \"metallb-operator-controller-manager-bc7c7cbf4-vznd6\" (UID: \"f4a3620c-32ab-4a6b-9a3b-0ca2e11868e9\") " pod="metallb-system/metallb-operator-controller-manager-bc7c7cbf4-vznd6" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.666322 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2ngl\" (UniqueName: \"kubernetes.io/projected/f4a3620c-32ab-4a6b-9a3b-0ca2e11868e9-kube-api-access-f2ngl\") pod \"metallb-operator-controller-manager-bc7c7cbf4-vznd6\" (UID: \"f4a3620c-32ab-4a6b-9a3b-0ca2e11868e9\") " pod="metallb-system/metallb-operator-controller-manager-bc7c7cbf4-vznd6" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.765110 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-bc7c7cbf4-vznd6" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.811404 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-57f9d579c4-n58rf"] Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.812153 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-57f9d579c4-n58rf" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.815148 4851 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-gpk69" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.815317 4851 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.815623 4851 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.835848 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-57f9d579c4-n58rf"] Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.836717 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8stv\" (UniqueName: \"kubernetes.io/projected/f8382d60-6b68-4f9a-8aa0-8fefd3c2de4c-kube-api-access-w8stv\") pod \"metallb-operator-webhook-server-57f9d579c4-n58rf\" (UID: \"f8382d60-6b68-4f9a-8aa0-8fefd3c2de4c\") " pod="metallb-system/metallb-operator-webhook-server-57f9d579c4-n58rf" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.836789 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f8382d60-6b68-4f9a-8aa0-8fefd3c2de4c-webhook-cert\") pod \"metallb-operator-webhook-server-57f9d579c4-n58rf\" (UID: \"f8382d60-6b68-4f9a-8aa0-8fefd3c2de4c\") " pod="metallb-system/metallb-operator-webhook-server-57f9d579c4-n58rf" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.836834 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8382d60-6b68-4f9a-8aa0-8fefd3c2de4c-apiservice-cert\") pod \"metallb-operator-webhook-server-57f9d579c4-n58rf\" (UID: \"f8382d60-6b68-4f9a-8aa0-8fefd3c2de4c\") " pod="metallb-system/metallb-operator-webhook-server-57f9d579c4-n58rf" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.937711 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8382d60-6b68-4f9a-8aa0-8fefd3c2de4c-apiservice-cert\") pod \"metallb-operator-webhook-server-57f9d579c4-n58rf\" (UID: \"f8382d60-6b68-4f9a-8aa0-8fefd3c2de4c\") " pod="metallb-system/metallb-operator-webhook-server-57f9d579c4-n58rf" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.940735 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8stv\" (UniqueName: \"kubernetes.io/projected/f8382d60-6b68-4f9a-8aa0-8fefd3c2de4c-kube-api-access-w8stv\") pod \"metallb-operator-webhook-server-57f9d579c4-n58rf\" (UID: \"f8382d60-6b68-4f9a-8aa0-8fefd3c2de4c\") " pod="metallb-system/metallb-operator-webhook-server-57f9d579c4-n58rf" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.940810 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f8382d60-6b68-4f9a-8aa0-8fefd3c2de4c-webhook-cert\") pod \"metallb-operator-webhook-server-57f9d579c4-n58rf\" (UID: \"f8382d60-6b68-4f9a-8aa0-8fefd3c2de4c\") " pod="metallb-system/metallb-operator-webhook-server-57f9d579c4-n58rf" Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 
13:06:24.949272 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8382d60-6b68-4f9a-8aa0-8fefd3c2de4c-apiservice-cert\") pod \"metallb-operator-webhook-server-57f9d579c4-n58rf\" (UID: \"f8382d60-6b68-4f9a-8aa0-8fefd3c2de4c\") " pod="metallb-system/metallb-operator-webhook-server-57f9d579c4-n58rf"
Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.951061 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f8382d60-6b68-4f9a-8aa0-8fefd3c2de4c-webhook-cert\") pod \"metallb-operator-webhook-server-57f9d579c4-n58rf\" (UID: \"f8382d60-6b68-4f9a-8aa0-8fefd3c2de4c\") " pod="metallb-system/metallb-operator-webhook-server-57f9d579c4-n58rf"
Oct 01 13:06:24 crc kubenswrapper[4851]: I1001 13:06:24.963417 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8stv\" (UniqueName: \"kubernetes.io/projected/f8382d60-6b68-4f9a-8aa0-8fefd3c2de4c-kube-api-access-w8stv\") pod \"metallb-operator-webhook-server-57f9d579c4-n58rf\" (UID: \"f8382d60-6b68-4f9a-8aa0-8fefd3c2de4c\") " pod="metallb-system/metallb-operator-webhook-server-57f9d579c4-n58rf"
Oct 01 13:06:25 crc kubenswrapper[4851]: I1001 13:06:25.130472 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-57f9d579c4-n58rf"
Oct 01 13:06:26 crc kubenswrapper[4851]: E1001 13:06:26.597886 4851 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6a47bb43692cd57273824d86c6218eb84765aa8796f412b207b701e4b5b70085 is running failed: container process not found" containerID="6a47bb43692cd57273824d86c6218eb84765aa8796f412b207b701e4b5b70085" cmd=["grpc_health_probe","-addr=:50051"]
Oct 01 13:06:26 crc kubenswrapper[4851]: E1001 13:06:26.598712 4851 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6a47bb43692cd57273824d86c6218eb84765aa8796f412b207b701e4b5b70085 is running failed: container process not found" containerID="6a47bb43692cd57273824d86c6218eb84765aa8796f412b207b701e4b5b70085" cmd=["grpc_health_probe","-addr=:50051"]
Oct 01 13:06:26 crc kubenswrapper[4851]: E1001 13:06:26.599201 4851 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6a47bb43692cd57273824d86c6218eb84765aa8796f412b207b701e4b5b70085 is running failed: container process not found" containerID="6a47bb43692cd57273824d86c6218eb84765aa8796f412b207b701e4b5b70085" cmd=["grpc_health_probe","-addr=:50051"]
Oct 01 13:06:26 crc kubenswrapper[4851]: E1001 13:06:26.599339 4851 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6a47bb43692cd57273824d86c6218eb84765aa8796f412b207b701e4b5b70085 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-477gz" podUID="506e30be-ed2d-446b-8ea1-cc3c0916638b" containerName="registry-server"
Oct 01 13:06:27 crc kubenswrapper[4851]: I1001 13:06:27.237812 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9t8f2"]
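The four ExecSync failures above are the registry-server readiness probe (an exec probe running grpc_health_probe -addr=:50051) racing container teardown: the container was killed at 13:06:19 and reported dead at 13:06:20, but a scheduled probe still fired at 13:06:26, so the runtime answered NotFound because the process is gone. A sketch of tolerating that race, assuming a hypothetical runtime client whose exec_sync raises NotFoundError for vanished containers (not CRI-O's actual API):

```python
class NotFoundError(Exception):
    """Stand-in for a gRPC NotFound status from the runtime."""

def run_exec_probe(runtime, container_id: str, cmd: list[str]) -> str:
    try:
        runtime.exec_sync(container_id, cmd)
        return "success"
    except NotFoundError as err:
        # Container disappeared between scheduling and running the probe;
        # record "Probe errored" rather than treating it as fatal.
        return f"errored: {err}"

class GoneRuntime:
    def exec_sync(self, container_id, cmd):
        raise NotFoundError("container is not created or running: "
                            "container process not found")

print(run_exec_probe(GoneRuntime(), "6a47bb4369...",
                     ["grpc_health_probe", "-addr=:50051"]))
```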
Need to start a new one" pod="openshift-marketplace/certified-operators-9t8f2" Oct 01 13:06:27 crc kubenswrapper[4851]: I1001 13:06:27.251644 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9t8f2"] Oct 01 13:06:27 crc kubenswrapper[4851]: I1001 13:06:27.308438 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e7297cb-e287-4224-909c-c51f28349ff4-catalog-content\") pod \"certified-operators-9t8f2\" (UID: \"3e7297cb-e287-4224-909c-c51f28349ff4\") " pod="openshift-marketplace/certified-operators-9t8f2" Oct 01 13:06:27 crc kubenswrapper[4851]: I1001 13:06:27.308491 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvcdm\" (UniqueName: \"kubernetes.io/projected/3e7297cb-e287-4224-909c-c51f28349ff4-kube-api-access-wvcdm\") pod \"certified-operators-9t8f2\" (UID: \"3e7297cb-e287-4224-909c-c51f28349ff4\") " pod="openshift-marketplace/certified-operators-9t8f2" Oct 01 13:06:27 crc kubenswrapper[4851]: I1001 13:06:27.308552 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e7297cb-e287-4224-909c-c51f28349ff4-utilities\") pod \"certified-operators-9t8f2\" (UID: \"3e7297cb-e287-4224-909c-c51f28349ff4\") " pod="openshift-marketplace/certified-operators-9t8f2" Oct 01 13:06:27 crc kubenswrapper[4851]: I1001 13:06:27.411668 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e7297cb-e287-4224-909c-c51f28349ff4-catalog-content\") pod \"certified-operators-9t8f2\" (UID: \"3e7297cb-e287-4224-909c-c51f28349ff4\") " pod="openshift-marketplace/certified-operators-9t8f2" Oct 01 13:06:27 crc kubenswrapper[4851]: I1001 13:06:27.411728 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvcdm\" (UniqueName: \"kubernetes.io/projected/3e7297cb-e287-4224-909c-c51f28349ff4-kube-api-access-wvcdm\") pod \"certified-operators-9t8f2\" (UID: \"3e7297cb-e287-4224-909c-c51f28349ff4\") " pod="openshift-marketplace/certified-operators-9t8f2" Oct 01 13:06:27 crc kubenswrapper[4851]: I1001 13:06:27.411787 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e7297cb-e287-4224-909c-c51f28349ff4-utilities\") pod \"certified-operators-9t8f2\" (UID: \"3e7297cb-e287-4224-909c-c51f28349ff4\") " pod="openshift-marketplace/certified-operators-9t8f2" Oct 01 13:06:27 crc kubenswrapper[4851]: I1001 13:06:27.412249 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e7297cb-e287-4224-909c-c51f28349ff4-utilities\") pod \"certified-operators-9t8f2\" (UID: \"3e7297cb-e287-4224-909c-c51f28349ff4\") " pod="openshift-marketplace/certified-operators-9t8f2" Oct 01 13:06:27 crc kubenswrapper[4851]: I1001 13:06:27.412457 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e7297cb-e287-4224-909c-c51f28349ff4-catalog-content\") pod \"certified-operators-9t8f2\" (UID: \"3e7297cb-e287-4224-909c-c51f28349ff4\") " pod="openshift-marketplace/certified-operators-9t8f2" Oct 01 13:06:27 crc kubenswrapper[4851]: I1001 13:06:27.432289 4851 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wvcdm\" (UniqueName: \"kubernetes.io/projected/3e7297cb-e287-4224-909c-c51f28349ff4-kube-api-access-wvcdm\") pod \"certified-operators-9t8f2\" (UID: \"3e7297cb-e287-4224-909c-c51f28349ff4\") " pod="openshift-marketplace/certified-operators-9t8f2" Oct 01 13:06:27 crc kubenswrapper[4851]: I1001 13:06:27.617411 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9t8f2" Oct 01 13:06:29 crc kubenswrapper[4851]: I1001 13:06:29.027915 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-477gz" Oct 01 13:06:29 crc kubenswrapper[4851]: E1001 13:06:29.084073 4851 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 01 13:06:29 crc kubenswrapper[4851]: E1001 13:06:29.084228 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c2k66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-bgnmp_openshift-marketplace(4c1a9063-23de-46a7-bd5e-8763dee075c4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 13:06:29 crc kubenswrapper[4851]: E1001 13:06:29.085540 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-bgnmp" podUID="4c1a9063-23de-46a7-bd5e-8763dee075c4" Oct 01 13:06:29 crc kubenswrapper[4851]: I1001 13:06:29.231636 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/506e30be-ed2d-446b-8ea1-cc3c0916638b-catalog-content\") pod \"506e30be-ed2d-446b-8ea1-cc3c0916638b\" (UID: \"506e30be-ed2d-446b-8ea1-cc3c0916638b\") " Oct 01 13:06:29 crc kubenswrapper[4851]: I1001 13:06:29.231949 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdcjv\" (UniqueName: \"kubernetes.io/projected/506e30be-ed2d-446b-8ea1-cc3c0916638b-kube-api-access-mdcjv\") pod \"506e30be-ed2d-446b-8ea1-cc3c0916638b\" (UID: \"506e30be-ed2d-446b-8ea1-cc3c0916638b\") " Oct 01 13:06:29 crc kubenswrapper[4851]: I1001 13:06:29.232046 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/506e30be-ed2d-446b-8ea1-cc3c0916638b-utilities\") pod \"506e30be-ed2d-446b-8ea1-cc3c0916638b\" (UID: \"506e30be-ed2d-446b-8ea1-cc3c0916638b\") " Oct 01 13:06:29 crc kubenswrapper[4851]: I1001 13:06:29.232872 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/506e30be-ed2d-446b-8ea1-cc3c0916638b-utilities" (OuterVolumeSpecName: "utilities") pod "506e30be-ed2d-446b-8ea1-cc3c0916638b" (UID: "506e30be-ed2d-446b-8ea1-cc3c0916638b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:06:29 crc kubenswrapper[4851]: I1001 13:06:29.237511 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/506e30be-ed2d-446b-8ea1-cc3c0916638b-kube-api-access-mdcjv" (OuterVolumeSpecName: "kube-api-access-mdcjv") pod "506e30be-ed2d-446b-8ea1-cc3c0916638b" (UID: "506e30be-ed2d-446b-8ea1-cc3c0916638b"). InnerVolumeSpecName "kube-api-access-mdcjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:29 crc kubenswrapper[4851]: I1001 13:06:29.242864 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/506e30be-ed2d-446b-8ea1-cc3c0916638b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "506e30be-ed2d-446b-8ea1-cc3c0916638b" (UID: "506e30be-ed2d-446b-8ea1-cc3c0916638b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:06:29 crc kubenswrapper[4851]: I1001 13:06:29.336549 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/506e30be-ed2d-446b-8ea1-cc3c0916638b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:29 crc kubenswrapper[4851]: I1001 13:06:29.336615 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdcjv\" (UniqueName: \"kubernetes.io/projected/506e30be-ed2d-446b-8ea1-cc3c0916638b-kube-api-access-mdcjv\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:29 crc kubenswrapper[4851]: I1001 13:06:29.336635 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/506e30be-ed2d-446b-8ea1-cc3c0916638b-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:29 crc kubenswrapper[4851]: I1001 13:06:29.522845 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-57f9d579c4-n58rf"] Oct 01 13:06:29 crc kubenswrapper[4851]: W1001 13:06:29.537811 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8382d60_6b68_4f9a_8aa0_8fefd3c2de4c.slice/crio-704a4c3340b86be40d6651b1ccab0ba12a2880587f69accd44f3560179646edd WatchSource:0}: Error finding container 704a4c3340b86be40d6651b1ccab0ba12a2880587f69accd44f3560179646edd: Status 404 returned error can't find the container with id 704a4c3340b86be40d6651b1ccab0ba12a2880587f69accd44f3560179646edd Oct 01 13:06:29 crc kubenswrapper[4851]: I1001 13:06:29.546269 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-477gz" event={"ID":"506e30be-ed2d-446b-8ea1-cc3c0916638b","Type":"ContainerDied","Data":"4b8bf7518617b4823e20ada91c8ce5608babc3259eedc9aec3cd2ae0c9b16393"} Oct 01 13:06:29 crc kubenswrapper[4851]: I1001 13:06:29.546337 4851 scope.go:117] "RemoveContainer" containerID="6a47bb43692cd57273824d86c6218eb84765aa8796f412b207b701e4b5b70085" Oct 01 13:06:29 crc kubenswrapper[4851]: I1001 13:06:29.546481 4851 util.go:48] "No ready sandbox for pod can be found. 
Oct 01 13:06:29 crc kubenswrapper[4851]: E1001 13:06:29.548339 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-bgnmp" podUID="4c1a9063-23de-46a7-bd5e-8763dee075c4"
Oct 01 13:06:29 crc kubenswrapper[4851]: I1001 13:06:29.551826 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-bc7c7cbf4-vznd6"]
Oct 01 13:06:29 crc kubenswrapper[4851]: W1001 13:06:29.565625 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4a3620c_32ab_4a6b_9a3b_0ca2e11868e9.slice/crio-08d14b8009787345b3c218772214eb2c1b063a1b5f0e8b182b2dceec6633fff7 WatchSource:0}: Error finding container 08d14b8009787345b3c218772214eb2c1b063a1b5f0e8b182b2dceec6633fff7: Status 404 returned error can't find the container with id 08d14b8009787345b3c218772214eb2c1b063a1b5f0e8b182b2dceec6633fff7
Oct 01 13:06:29 crc kubenswrapper[4851]: I1001 13:06:29.571045 4851 scope.go:117] "RemoveContainer" containerID="9e3837c8ea53c0681faea6080098c7b3191e75c755342b21913bb2e170d33782"
Oct 01 13:06:29 crc kubenswrapper[4851]: I1001 13:06:29.598625 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-477gz"]
Oct 01 13:06:29 crc kubenswrapper[4851]: I1001 13:06:29.603529 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-477gz"]
Oct 01 13:06:29 crc kubenswrapper[4851]: I1001 13:06:29.611121 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9t8f2"]
Oct 01 13:06:29 crc kubenswrapper[4851]: I1001 13:06:29.619761 4851 scope.go:117] "RemoveContainer" containerID="8cd25c12efc8a031eaa0ff419ac471a3e4b103dd7431cce39e6afcd970e71b87"
Oct 01 13:06:30 crc kubenswrapper[4851]: I1001 13:06:30.335224 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="506e30be-ed2d-446b-8ea1-cc3c0916638b" path="/var/lib/kubelet/pods/506e30be-ed2d-446b-8ea1-cc3c0916638b/volumes"
Oct 01 13:06:30 crc kubenswrapper[4851]: I1001 13:06:30.553202 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-57f9d579c4-n58rf" event={"ID":"f8382d60-6b68-4f9a-8aa0-8fefd3c2de4c","Type":"ContainerStarted","Data":"704a4c3340b86be40d6651b1ccab0ba12a2880587f69accd44f3560179646edd"}
Oct 01 13:06:30 crc kubenswrapper[4851]: I1001 13:06:30.554271 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-bc7c7cbf4-vznd6" event={"ID":"f4a3620c-32ab-4a6b-9a3b-0ca2e11868e9","Type":"ContainerStarted","Data":"08d14b8009787345b3c218772214eb2c1b063a1b5f0e8b182b2dceec6633fff7"}
Oct 01 13:06:30 crc kubenswrapper[4851]: I1001 13:06:30.556655 4851 generic.go:334] "Generic (PLEG): container finished" podID="3e7297cb-e287-4224-909c-c51f28349ff4" containerID="f1ee6697602015058212f830de207357c0db4aabd490a63de72f4ad6847e4c21" exitCode=0
Oct 01 13:06:30 crc kubenswrapper[4851]: I1001 13:06:30.556690 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9t8f2" event={"ID":"3e7297cb-e287-4224-909c-c51f28349ff4","Type":"ContainerDied","Data":"f1ee6697602015058212f830de207357c0db4aabd490a63de72f4ad6847e4c21"}
Oct 01 13:06:30 crc kubenswrapper[4851]: I1001 13:06:30.556713 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9t8f2" event={"ID":"3e7297cb-e287-4224-909c-c51f28349ff4","Type":"ContainerStarted","Data":"45342f0e4a145f6c883eb319d88767f9035cb5c9affc625d9e3b53f2b4395d41"}
Oct 01 13:06:31 crc kubenswrapper[4851]: I1001 13:06:31.565901 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9t8f2" event={"ID":"3e7297cb-e287-4224-909c-c51f28349ff4","Type":"ContainerStarted","Data":"f8f31a51f8bfe809f4c2e0f4399c2b8201736a5f236c0b1c6d442f2fcc559609"}
Oct 01 13:06:32 crc kubenswrapper[4851]: I1001 13:06:32.576154 4851 generic.go:334] "Generic (PLEG): container finished" podID="3e7297cb-e287-4224-909c-c51f28349ff4" containerID="f8f31a51f8bfe809f4c2e0f4399c2b8201736a5f236c0b1c6d442f2fcc559609" exitCode=0
Oct 01 13:06:32 crc kubenswrapper[4851]: I1001 13:06:32.576202 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9t8f2" event={"ID":"3e7297cb-e287-4224-909c-c51f28349ff4","Type":"ContainerDied","Data":"f8f31a51f8bfe809f4c2e0f4399c2b8201736a5f236c0b1c6d442f2fcc559609"}
Oct 01 13:06:35 crc kubenswrapper[4851]: I1001 13:06:35.598625 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-57f9d579c4-n58rf" event={"ID":"f8382d60-6b68-4f9a-8aa0-8fefd3c2de4c","Type":"ContainerStarted","Data":"38f5d5afa522c66d7527ee7214bd387d83fbd4e213c7a57a27d058a053f883ac"}
Oct 01 13:06:35 crc kubenswrapper[4851]: I1001 13:06:35.599141 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-57f9d579c4-n58rf"
Oct 01 13:06:35 crc kubenswrapper[4851]: I1001 13:06:35.600395 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9t8f2" event={"ID":"3e7297cb-e287-4224-909c-c51f28349ff4","Type":"ContainerStarted","Data":"e433dc2f43e2adfbae2d0a0b2ab5c38e9b211139b5a17171565e4b5c54c66f7f"}
Oct 01 13:06:35 crc kubenswrapper[4851]: I1001 13:06:35.633224 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-57f9d579c4-n58rf" podStartSLOduration=6.638546663 podStartE2EDuration="11.633206566s" podCreationTimestamp="2025-10-01 13:06:24 +0000 UTC" firstStartedPulling="2025-10-01 13:06:29.541227762 +0000 UTC m=+797.886345248" lastFinishedPulling="2025-10-01 13:06:34.535887665 +0000 UTC m=+802.881005151" observedRunningTime="2025-10-01 13:06:35.617361828 +0000 UTC m=+803.962479314" watchObservedRunningTime="2025-10-01 13:06:35.633206566 +0000 UTC m=+803.978324042"
Oct 01 13:06:35 crc kubenswrapper[4851]: I1001 13:06:35.652332 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9t8f2" podStartSLOduration=4.699979116 podStartE2EDuration="8.652316558s" podCreationTimestamp="2025-10-01 13:06:27 +0000 UTC" firstStartedPulling="2025-10-01 13:06:30.558320764 +0000 UTC m=+798.903438250" lastFinishedPulling="2025-10-01 13:06:34.510658196 +0000 UTC m=+802.855775692" observedRunningTime="2025-10-01 13:06:35.648539239 +0000 UTC m=+803.993656725" watchObservedRunningTime="2025-10-01 13:06:35.652316558 +0000 UTC m=+803.997434044"
Oct 01 13:06:37 crc kubenswrapper[4851]: I1001 13:06:37.617793 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9t8f2"
Oct 01 13:06:37 crc kubenswrapper[4851]: I1001 13:06:37.618141 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9t8f2"
Oct 01 13:06:37 crc kubenswrapper[4851]: I1001 13:06:37.618158 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-bc7c7cbf4-vznd6" event={"ID":"f4a3620c-32ab-4a6b-9a3b-0ca2e11868e9","Type":"ContainerStarted","Data":"c09c080b1ab9fd25979eadcbd5065eca8b08430bceb2ae343f489612446bb770"}
Oct 01 13:06:37 crc kubenswrapper[4851]: I1001 13:06:37.619428 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-bc7c7cbf4-vznd6"
Oct 01 13:06:37 crc kubenswrapper[4851]: I1001 13:06:37.650636 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-bc7c7cbf4-vznd6" podStartSLOduration=6.185095484 podStartE2EDuration="13.650617483s" podCreationTimestamp="2025-10-01 13:06:24 +0000 UTC" firstStartedPulling="2025-10-01 13:06:29.575721099 +0000 UTC m=+797.920838585" lastFinishedPulling="2025-10-01 13:06:37.041243088 +0000 UTC m=+805.386360584" observedRunningTime="2025-10-01 13:06:37.647624657 +0000 UTC m=+805.992742153" watchObservedRunningTime="2025-10-01 13:06:37.650617483 +0000 UTC m=+805.995734969"
Oct 01 13:06:37 crc kubenswrapper[4851]: I1001 13:06:37.684063 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9t8f2"
Oct 01 13:06:43 crc kubenswrapper[4851]: I1001 13:06:43.243544 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q4rrn"]
Oct 01 13:06:43 crc kubenswrapper[4851]: E1001 13:06:43.243982 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="506e30be-ed2d-446b-8ea1-cc3c0916638b" containerName="extract-content"
Oct 01 13:06:43 crc kubenswrapper[4851]: I1001 13:06:43.243995 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="506e30be-ed2d-446b-8ea1-cc3c0916638b" containerName="extract-content"
Oct 01 13:06:43 crc kubenswrapper[4851]: E1001 13:06:43.244005 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="506e30be-ed2d-446b-8ea1-cc3c0916638b" containerName="registry-server"
Oct 01 13:06:43 crc kubenswrapper[4851]: I1001 13:06:43.244012 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="506e30be-ed2d-446b-8ea1-cc3c0916638b" containerName="registry-server"
Oct 01 13:06:43 crc kubenswrapper[4851]: E1001 13:06:43.244023 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="506e30be-ed2d-446b-8ea1-cc3c0916638b" containerName="extract-utilities"
Oct 01 13:06:43 crc kubenswrapper[4851]: I1001 13:06:43.244029 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="506e30be-ed2d-446b-8ea1-cc3c0916638b" containerName="extract-utilities"
Oct 01 13:06:43 crc kubenswrapper[4851]: I1001 13:06:43.244149 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="506e30be-ed2d-446b-8ea1-cc3c0916638b" containerName="registry-server"
Oct 01 13:06:43 crc kubenswrapper[4851]: I1001 13:06:43.244899 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q4rrn"
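The pod_startup_latency_tracker lines above encode a simple relationship: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration is the same interval with the image-pull window (lastFinishedPulling minus firstStartedPulling) subtracted. The sketch below reproduces the webhook-server numbers from the entry above; the timestamps are copied from the log (monotonic m=+... suffixes dropped), and the layout constant is our assumption that matches Go's default time formatting:

```go
// slo_math.go - reproduces the arithmetic behind the
// "Observed pod startup duration" entry for the webhook-server pod:
// podStartSLOduration = podStartE2EDuration - (lastFinishedPulling - firstStartedPulling)
package main

import (
	"fmt"
	"time"
)

// Layout of time.Time's default String() output, which is how these
// timestamps appear in the log.
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-10-01 13:06:24 +0000 UTC")
	firstPull := mustParse("2025-10-01 13:06:29.541227762 +0000 UTC")
	lastPull := mustParse("2025-10-01 13:06:34.535887665 +0000 UTC")
	observed := mustParse("2025-10-01 13:06:35.633206566 +0000 UTC")

	e2e := observed.Sub(created)         // 11.633206566s, as logged
	slo := e2e - lastPull.Sub(firstPull) // 6.638546663s: pull time excluded
	fmt.Println(e2e, slo)
}
```

In other words, of the webhook server's 11.6s wall-clock startup, roughly 5s was spent pulling the image, and only the remainder counts against the startup SLO.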
Oct 01 13:06:43 crc kubenswrapper[4851]: I1001 13:06:43.263595 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q4rrn"]
Oct 01 13:06:43 crc kubenswrapper[4851]: I1001 13:06:43.343106 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtsp7\" (UniqueName: \"kubernetes.io/projected/180fbebf-14ac-401d-bd39-1b4238cc2d2e-kube-api-access-mtsp7\") pod \"community-operators-q4rrn\" (UID: \"180fbebf-14ac-401d-bd39-1b4238cc2d2e\") " pod="openshift-marketplace/community-operators-q4rrn"
Oct 01 13:06:43 crc kubenswrapper[4851]: I1001 13:06:43.343556 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/180fbebf-14ac-401d-bd39-1b4238cc2d2e-utilities\") pod \"community-operators-q4rrn\" (UID: \"180fbebf-14ac-401d-bd39-1b4238cc2d2e\") " pod="openshift-marketplace/community-operators-q4rrn"
Oct 01 13:06:43 crc kubenswrapper[4851]: I1001 13:06:43.343589 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/180fbebf-14ac-401d-bd39-1b4238cc2d2e-catalog-content\") pod \"community-operators-q4rrn\" (UID: \"180fbebf-14ac-401d-bd39-1b4238cc2d2e\") " pod="openshift-marketplace/community-operators-q4rrn"
Oct 01 13:06:43 crc kubenswrapper[4851]: I1001 13:06:43.444743 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/180fbebf-14ac-401d-bd39-1b4238cc2d2e-utilities\") pod \"community-operators-q4rrn\" (UID: \"180fbebf-14ac-401d-bd39-1b4238cc2d2e\") " pod="openshift-marketplace/community-operators-q4rrn"
Oct 01 13:06:43 crc kubenswrapper[4851]: I1001 13:06:43.445307 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/180fbebf-14ac-401d-bd39-1b4238cc2d2e-utilities\") pod \"community-operators-q4rrn\" (UID: \"180fbebf-14ac-401d-bd39-1b4238cc2d2e\") " pod="openshift-marketplace/community-operators-q4rrn"
Oct 01 13:06:43 crc kubenswrapper[4851]: I1001 13:06:43.445385 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/180fbebf-14ac-401d-bd39-1b4238cc2d2e-catalog-content\") pod \"community-operators-q4rrn\" (UID: \"180fbebf-14ac-401d-bd39-1b4238cc2d2e\") " pod="openshift-marketplace/community-operators-q4rrn"
Oct 01 13:06:43 crc kubenswrapper[4851]: I1001 13:06:43.445716 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/180fbebf-14ac-401d-bd39-1b4238cc2d2e-catalog-content\") pod \"community-operators-q4rrn\" (UID: \"180fbebf-14ac-401d-bd39-1b4238cc2d2e\") " pod="openshift-marketplace/community-operators-q4rrn"
Oct 01 13:06:43 crc kubenswrapper[4851]: I1001 13:06:43.446694 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtsp7\" (UniqueName: \"kubernetes.io/projected/180fbebf-14ac-401d-bd39-1b4238cc2d2e-kube-api-access-mtsp7\") pod \"community-operators-q4rrn\" (UID: \"180fbebf-14ac-401d-bd39-1b4238cc2d2e\") " pod="openshift-marketplace/community-operators-q4rrn"
Oct 01 13:06:43 crc kubenswrapper[4851]: I1001 13:06:43.471415 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtsp7\" (UniqueName: \"kubernetes.io/projected/180fbebf-14ac-401d-bd39-1b4238cc2d2e-kube-api-access-mtsp7\") pod \"community-operators-q4rrn\" (UID: \"180fbebf-14ac-401d-bd39-1b4238cc2d2e\") " pod="openshift-marketplace/community-operators-q4rrn"
Oct 01 13:06:43 crc kubenswrapper[4851]: I1001 13:06:43.558646 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q4rrn"
Oct 01 13:06:43 crc kubenswrapper[4851]: I1001 13:06:43.678493 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgnmp" event={"ID":"4c1a9063-23de-46a7-bd5e-8763dee075c4","Type":"ContainerStarted","Data":"8f4068f5deb06cb40483af2d60ad18948b42d3d29a87bfc2c306bd08251bc8a5"}
Oct 01 13:06:44 crc kubenswrapper[4851]: I1001 13:06:44.020092 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q4rrn"]
Oct 01 13:06:44 crc kubenswrapper[4851]: W1001 13:06:44.025407 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod180fbebf_14ac_401d_bd39_1b4238cc2d2e.slice/crio-b38e4a9fe7ea3577590a8c42c9fee23bfb345856f935c3238fff572e37ad1975 WatchSource:0}: Error finding container b38e4a9fe7ea3577590a8c42c9fee23bfb345856f935c3238fff572e37ad1975: Status 404 returned error can't find the container with id b38e4a9fe7ea3577590a8c42c9fee23bfb345856f935c3238fff572e37ad1975
Oct 01 13:06:44 crc kubenswrapper[4851]: I1001 13:06:44.685985 4851 generic.go:334] "Generic (PLEG): container finished" podID="180fbebf-14ac-401d-bd39-1b4238cc2d2e" containerID="66b57dd92841b58c5c24f5996dc26e270c83562a439bd06680c7a2fe52c9379e" exitCode=0
Oct 01 13:06:44 crc kubenswrapper[4851]: I1001 13:06:44.686069 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q4rrn" event={"ID":"180fbebf-14ac-401d-bd39-1b4238cc2d2e","Type":"ContainerDied","Data":"66b57dd92841b58c5c24f5996dc26e270c83562a439bd06680c7a2fe52c9379e"}
Oct 01 13:06:44 crc kubenswrapper[4851]: I1001 13:06:44.686344 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q4rrn" event={"ID":"180fbebf-14ac-401d-bd39-1b4238cc2d2e","Type":"ContainerStarted","Data":"b38e4a9fe7ea3577590a8c42c9fee23bfb345856f935c3238fff572e37ad1975"}
Oct 01 13:06:44 crc kubenswrapper[4851]: I1001 13:06:44.688434 4851 generic.go:334] "Generic (PLEG): container finished" podID="4c1a9063-23de-46a7-bd5e-8763dee075c4" containerID="8f4068f5deb06cb40483af2d60ad18948b42d3d29a87bfc2c306bd08251bc8a5" exitCode=0
Oct 01 13:06:44 crc kubenswrapper[4851]: I1001 13:06:44.688493 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgnmp" event={"ID":"4c1a9063-23de-46a7-bd5e-8763dee075c4","Type":"ContainerDied","Data":"8f4068f5deb06cb40483af2d60ad18948b42d3d29a87bfc2c306bd08251bc8a5"}
Oct 01 13:06:45 crc kubenswrapper[4851]: I1001 13:06:45.142847 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-57f9d579c4-n58rf"
Oct 01 13:06:47 crc kubenswrapper[4851]: I1001 13:06:47.681484 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9t8f2"
Oct 01 13:06:49 crc kubenswrapper[4851]: I1001 13:06:49.746083 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgnmp" event={"ID":"4c1a9063-23de-46a7-bd5e-8763dee075c4","Type":"ContainerStarted","Data":"1e677bce908489306abaf9db0dec5ba46d7e6847074252fb7662563135a009a0"}
Oct 01 13:06:50 crc kubenswrapper[4851]: I1001 13:06:50.434046 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9t8f2"]
Oct 01 13:06:50 crc kubenswrapper[4851]: I1001 13:06:50.434282 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9t8f2" podUID="3e7297cb-e287-4224-909c-c51f28349ff4" containerName="registry-server" containerID="cri-o://e433dc2f43e2adfbae2d0a0b2ab5c38e9b211139b5a17171565e4b5c54c66f7f" gracePeriod=2
Oct 01 13:06:50 crc kubenswrapper[4851]: I1001 13:06:50.782474 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bgnmp" podStartSLOduration=7.780944832 podStartE2EDuration="39.78243201s" podCreationTimestamp="2025-10-01 13:06:11 +0000 UTC" firstStartedPulling="2025-10-01 13:06:13.413810632 +0000 UTC m=+781.758928118" lastFinishedPulling="2025-10-01 13:06:45.41529781 +0000 UTC m=+813.760415296" observedRunningTime="2025-10-01 13:06:50.774110299 +0000 UTC m=+819.119227775" watchObservedRunningTime="2025-10-01 13:06:50.78243201 +0000 UTC m=+819.127549536"
Oct 01 13:06:51 crc kubenswrapper[4851]: I1001 13:06:51.764034 4851 generic.go:334] "Generic (PLEG): container finished" podID="3e7297cb-e287-4224-909c-c51f28349ff4" containerID="e433dc2f43e2adfbae2d0a0b2ab5c38e9b211139b5a17171565e4b5c54c66f7f" exitCode=0
Oct 01 13:06:51 crc kubenswrapper[4851]: I1001 13:06:51.764144 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9t8f2" event={"ID":"3e7297cb-e287-4224-909c-c51f28349ff4","Type":"ContainerDied","Data":"e433dc2f43e2adfbae2d0a0b2ab5c38e9b211139b5a17171565e4b5c54c66f7f"}
Oct 01 13:06:51 crc kubenswrapper[4851]: I1001 13:06:51.988643 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bgnmp"
Oct 01 13:06:51 crc kubenswrapper[4851]: I1001 13:06:51.988709 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bgnmp"
Oct 01 13:06:52 crc kubenswrapper[4851]: I1001 13:06:52.251865 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9t8f2"
Oct 01 13:06:52 crc kubenswrapper[4851]: I1001 13:06:52.421023 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvcdm\" (UniqueName: \"kubernetes.io/projected/3e7297cb-e287-4224-909c-c51f28349ff4-kube-api-access-wvcdm\") pod \"3e7297cb-e287-4224-909c-c51f28349ff4\" (UID: \"3e7297cb-e287-4224-909c-c51f28349ff4\") "
Oct 01 13:06:52 crc kubenswrapper[4851]: I1001 13:06:52.421219 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e7297cb-e287-4224-909c-c51f28349ff4-utilities\") pod \"3e7297cb-e287-4224-909c-c51f28349ff4\" (UID: \"3e7297cb-e287-4224-909c-c51f28349ff4\") "
Oct 01 13:06:52 crc kubenswrapper[4851]: I1001 13:06:52.422327 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e7297cb-e287-4224-909c-c51f28349ff4-utilities" (OuterVolumeSpecName: "utilities") pod "3e7297cb-e287-4224-909c-c51f28349ff4" (UID: "3e7297cb-e287-4224-909c-c51f28349ff4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:06:52 crc kubenswrapper[4851]: I1001 13:06:52.422417 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e7297cb-e287-4224-909c-c51f28349ff4-catalog-content\") pod \"3e7297cb-e287-4224-909c-c51f28349ff4\" (UID: \"3e7297cb-e287-4224-909c-c51f28349ff4\") "
Oct 01 13:06:52 crc kubenswrapper[4851]: I1001 13:06:52.423620 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e7297cb-e287-4224-909c-c51f28349ff4-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 13:06:52 crc kubenswrapper[4851]: I1001 13:06:52.433156 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e7297cb-e287-4224-909c-c51f28349ff4-kube-api-access-wvcdm" (OuterVolumeSpecName: "kube-api-access-wvcdm") pod "3e7297cb-e287-4224-909c-c51f28349ff4" (UID: "3e7297cb-e287-4224-909c-c51f28349ff4"). InnerVolumeSpecName "kube-api-access-wvcdm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:06:52 crc kubenswrapper[4851]: I1001 13:06:52.483089 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e7297cb-e287-4224-909c-c51f28349ff4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e7297cb-e287-4224-909c-c51f28349ff4" (UID: "3e7297cb-e287-4224-909c-c51f28349ff4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:06:52 crc kubenswrapper[4851]: I1001 13:06:52.524757 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e7297cb-e287-4224-909c-c51f28349ff4-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 13:06:52 crc kubenswrapper[4851]: I1001 13:06:52.524796 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvcdm\" (UniqueName: \"kubernetes.io/projected/3e7297cb-e287-4224-909c-c51f28349ff4-kube-api-access-wvcdm\") on node \"crc\" DevicePath \"\""
Oct 01 13:06:52 crc kubenswrapper[4851]: I1001 13:06:52.773644 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9t8f2" event={"ID":"3e7297cb-e287-4224-909c-c51f28349ff4","Type":"ContainerDied","Data":"45342f0e4a145f6c883eb319d88767f9035cb5c9affc625d9e3b53f2b4395d41"}
Oct 01 13:06:52 crc kubenswrapper[4851]: I1001 13:06:52.773732 4851 scope.go:117] "RemoveContainer" containerID="e433dc2f43e2adfbae2d0a0b2ab5c38e9b211139b5a17171565e4b5c54c66f7f"
Oct 01 13:06:52 crc kubenswrapper[4851]: I1001 13:06:52.773726 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9t8f2"
Oct 01 13:06:52 crc kubenswrapper[4851]: I1001 13:06:52.831228 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9t8f2"]
Oct 01 13:06:52 crc kubenswrapper[4851]: I1001 13:06:52.836271 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9t8f2"]
Oct 01 13:06:53 crc kubenswrapper[4851]: I1001 13:06:53.070826 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bgnmp" podUID="4c1a9063-23de-46a7-bd5e-8763dee075c4" containerName="registry-server" probeResult="failure" output=<
Oct 01 13:06:53 crc kubenswrapper[4851]: timeout: failed to connect service ":50051" within 1s
Oct 01 13:06:53 crc kubenswrapper[4851]: >
Oct 01 13:06:54 crc kubenswrapper[4851]: I1001 13:06:54.368387 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e7297cb-e287-4224-909c-c51f28349ff4" path="/var/lib/kubelet/pods/3e7297cb-e287-4224-909c-c51f28349ff4/volumes"
Oct 01 13:06:55 crc kubenswrapper[4851]: I1001 13:06:55.399453 4851 scope.go:117] "RemoveContainer" containerID="f8f31a51f8bfe809f4c2e0f4399c2b8201736a5f236c0b1c6d442f2fcc559609"
Oct 01 13:06:55 crc kubenswrapper[4851]: I1001 13:06:55.437569 4851 scope.go:117] "RemoveContainer" containerID="f1ee6697602015058212f830de207357c0db4aabd490a63de72f4ad6847e4c21"
Oct 01 13:06:55 crc kubenswrapper[4851]: I1001 13:06:55.797689 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q4rrn" event={"ID":"180fbebf-14ac-401d-bd39-1b4238cc2d2e","Type":"ContainerStarted","Data":"cbef644caba019a6397bcbd24a21ed08ef78c941716580eba1ee2b8c1840637b"}
Oct 01 13:06:56 crc kubenswrapper[4851]: I1001 13:06:56.811230 4851 generic.go:334] "Generic (PLEG): container finished" podID="180fbebf-14ac-401d-bd39-1b4238cc2d2e" containerID="cbef644caba019a6397bcbd24a21ed08ef78c941716580eba1ee2b8c1840637b" exitCode=0
Oct 01 13:06:56 crc kubenswrapper[4851]: I1001 13:06:56.811332 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q4rrn" event={"ID":"180fbebf-14ac-401d-bd39-1b4238cc2d2e","Type":"ContainerDied","Data":"cbef644caba019a6397bcbd24a21ed08ef78c941716580eba1ee2b8c1840637b"}
Oct 01 13:06:57 crc kubenswrapper[4851]: I1001 13:06:57.820626 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q4rrn" event={"ID":"180fbebf-14ac-401d-bd39-1b4238cc2d2e","Type":"ContainerStarted","Data":"efdb079af2a5a46afa5f16052739368d559d1ceb5edaa3225f36ab2cdb077295"}
Oct 01 13:06:57 crc kubenswrapper[4851]: I1001 13:06:57.850202 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q4rrn" podStartSLOduration=2.3166060059999998 podStartE2EDuration="14.850171909s" podCreationTimestamp="2025-10-01 13:06:43 +0000 UTC" firstStartedPulling="2025-10-01 13:06:44.687892583 +0000 UTC m=+813.033010069" lastFinishedPulling="2025-10-01 13:06:57.221458446 +0000 UTC m=+825.566575972" observedRunningTime="2025-10-01 13:06:57.845329329 +0000 UTC m=+826.190446825" watchObservedRunningTime="2025-10-01 13:06:57.850171909 +0000 UTC m=+826.195289435"
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.060077 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bgnmp"
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.142967 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bgnmp"
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.250252 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bgnmp"]
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.303373 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7cggq"]
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.303997 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7cggq" podUID="dd203b3c-a861-4dba-9df9-7328c541c294" containerName="registry-server" containerID="cri-o://27e6c8b9f04e2c3569d8005e4ccaceade26cf36f4b102f3346baf7ac41949437" gracePeriod=2
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.684240 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7cggq"
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.853606 4851 generic.go:334] "Generic (PLEG): container finished" podID="dd203b3c-a861-4dba-9df9-7328c541c294" containerID="27e6c8b9f04e2c3569d8005e4ccaceade26cf36f4b102f3346baf7ac41949437" exitCode=0
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.853687 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7cggq"
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.853686 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7cggq" event={"ID":"dd203b3c-a861-4dba-9df9-7328c541c294","Type":"ContainerDied","Data":"27e6c8b9f04e2c3569d8005e4ccaceade26cf36f4b102f3346baf7ac41949437"}
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.854044 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7cggq" event={"ID":"dd203b3c-a861-4dba-9df9-7328c541c294","Type":"ContainerDied","Data":"a3ecadc3f2b6fadba7b481b44fa04eb47c9b2949c84a63e15f0dad3b7f0aa25d"}
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.854067 4851 scope.go:117] "RemoveContainer" containerID="27e6c8b9f04e2c3569d8005e4ccaceade26cf36f4b102f3346baf7ac41949437"
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.869672 4851 scope.go:117] "RemoveContainer" containerID="8748275bc06f5a9c8a794a6ab481219cd137f446c1adc70b8e2563cbec3867d9"
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.872633 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd203b3c-a861-4dba-9df9-7328c541c294-utilities\") pod \"dd203b3c-a861-4dba-9df9-7328c541c294\" (UID: \"dd203b3c-a861-4dba-9df9-7328c541c294\") "
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.872685 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd203b3c-a861-4dba-9df9-7328c541c294-catalog-content\") pod \"dd203b3c-a861-4dba-9df9-7328c541c294\" (UID: \"dd203b3c-a861-4dba-9df9-7328c541c294\") "
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.872775 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh5z2\" (UniqueName: \"kubernetes.io/projected/dd203b3c-a861-4dba-9df9-7328c541c294-kube-api-access-xh5z2\") pod \"dd203b3c-a861-4dba-9df9-7328c541c294\" (UID: \"dd203b3c-a861-4dba-9df9-7328c541c294\") "
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.873646 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd203b3c-a861-4dba-9df9-7328c541c294-utilities" (OuterVolumeSpecName: "utilities") pod "dd203b3c-a861-4dba-9df9-7328c541c294" (UID: "dd203b3c-a861-4dba-9df9-7328c541c294"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.877549 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd203b3c-a861-4dba-9df9-7328c541c294-kube-api-access-xh5z2" (OuterVolumeSpecName: "kube-api-access-xh5z2") pod "dd203b3c-a861-4dba-9df9-7328c541c294" (UID: "dd203b3c-a861-4dba-9df9-7328c541c294"). InnerVolumeSpecName "kube-api-access-xh5z2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.896935 4851 scope.go:117] "RemoveContainer" containerID="372be6bd50b706252f67c40bf146019dde8dcf14fc6a9e744fdf8fa8c4afef0a"
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.926710 4851 scope.go:117] "RemoveContainer" containerID="27e6c8b9f04e2c3569d8005e4ccaceade26cf36f4b102f3346baf7ac41949437"
Oct 01 13:07:02 crc kubenswrapper[4851]: E1001 13:07:02.927241 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27e6c8b9f04e2c3569d8005e4ccaceade26cf36f4b102f3346baf7ac41949437\": container with ID starting with 27e6c8b9f04e2c3569d8005e4ccaceade26cf36f4b102f3346baf7ac41949437 not found: ID does not exist" containerID="27e6c8b9f04e2c3569d8005e4ccaceade26cf36f4b102f3346baf7ac41949437"
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.927278 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e6c8b9f04e2c3569d8005e4ccaceade26cf36f4b102f3346baf7ac41949437"} err="failed to get container status \"27e6c8b9f04e2c3569d8005e4ccaceade26cf36f4b102f3346baf7ac41949437\": rpc error: code = NotFound desc = could not find container \"27e6c8b9f04e2c3569d8005e4ccaceade26cf36f4b102f3346baf7ac41949437\": container with ID starting with 27e6c8b9f04e2c3569d8005e4ccaceade26cf36f4b102f3346baf7ac41949437 not found: ID does not exist"
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.927306 4851 scope.go:117] "RemoveContainer" containerID="8748275bc06f5a9c8a794a6ab481219cd137f446c1adc70b8e2563cbec3867d9"
Oct 01 13:07:02 crc kubenswrapper[4851]: E1001 13:07:02.927671 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8748275bc06f5a9c8a794a6ab481219cd137f446c1adc70b8e2563cbec3867d9\": container with ID starting with 8748275bc06f5a9c8a794a6ab481219cd137f446c1adc70b8e2563cbec3867d9 not found: ID does not exist" containerID="8748275bc06f5a9c8a794a6ab481219cd137f446c1adc70b8e2563cbec3867d9"
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.927699 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8748275bc06f5a9c8a794a6ab481219cd137f446c1adc70b8e2563cbec3867d9"} err="failed to get container status \"8748275bc06f5a9c8a794a6ab481219cd137f446c1adc70b8e2563cbec3867d9\": rpc error: code = NotFound desc = could not find container \"8748275bc06f5a9c8a794a6ab481219cd137f446c1adc70b8e2563cbec3867d9\": container with ID starting with 8748275bc06f5a9c8a794a6ab481219cd137f446c1adc70b8e2563cbec3867d9 not found: ID does not exist"
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.927717 4851 scope.go:117] "RemoveContainer" containerID="372be6bd50b706252f67c40bf146019dde8dcf14fc6a9e744fdf8fa8c4afef0a"
Oct 01 13:07:02 crc kubenswrapper[4851]: E1001 13:07:02.928029 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"372be6bd50b706252f67c40bf146019dde8dcf14fc6a9e744fdf8fa8c4afef0a\": container with ID starting with 372be6bd50b706252f67c40bf146019dde8dcf14fc6a9e744fdf8fa8c4afef0a not found: ID does not exist" containerID="372be6bd50b706252f67c40bf146019dde8dcf14fc6a9e744fdf8fa8c4afef0a"
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.928053 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"372be6bd50b706252f67c40bf146019dde8dcf14fc6a9e744fdf8fa8c4afef0a"} err="failed to get container status \"372be6bd50b706252f67c40bf146019dde8dcf14fc6a9e744fdf8fa8c4afef0a\": rpc error: code = NotFound desc = could not find container \"372be6bd50b706252f67c40bf146019dde8dcf14fc6a9e744fdf8fa8c4afef0a\": container with ID starting with 372be6bd50b706252f67c40bf146019dde8dcf14fc6a9e744fdf8fa8c4afef0a not found: ID does not exist"
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.971167 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd203b3c-a861-4dba-9df9-7328c541c294-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd203b3c-a861-4dba-9df9-7328c541c294" (UID: "dd203b3c-a861-4dba-9df9-7328c541c294"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.974952 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd203b3c-a861-4dba-9df9-7328c541c294-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.974994 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd203b3c-a861-4dba-9df9-7328c541c294-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 13:07:02 crc kubenswrapper[4851]: I1001 13:07:02.975012 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh5z2\" (UniqueName: \"kubernetes.io/projected/dd203b3c-a861-4dba-9df9-7328c541c294-kube-api-access-xh5z2\") on node \"crc\" DevicePath \"\""
Oct 01 13:07:03 crc kubenswrapper[4851]: I1001 13:07:03.181736 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7cggq"]
Oct 01 13:07:03 crc kubenswrapper[4851]: I1001 13:07:03.188464 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7cggq"]
Oct 01 13:07:03 crc kubenswrapper[4851]: I1001 13:07:03.559136 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q4rrn"
Oct 01 13:07:03 crc kubenswrapper[4851]: I1001 13:07:03.559189 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q4rrn"
Oct 01 13:07:03 crc kubenswrapper[4851]: I1001 13:07:03.611419 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q4rrn"
Oct 01 13:07:03 crc kubenswrapper[4851]: I1001 13:07:03.916435 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q4rrn"
Oct 01 13:07:04 crc kubenswrapper[4851]: I1001 13:07:04.340953 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd203b3c-a861-4dba-9df9-7328c541c294" path="/var/lib/kubelet/pods/dd203b3c-a861-4dba-9df9-7328c541c294/volumes"
Oct 01 13:07:05 crc kubenswrapper[4851]: I1001 13:07:05.505697 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q4rrn"]
Oct 01 13:07:05 crc kubenswrapper[4851]: I1001 13:07:05.872387 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q4rrn" podUID="180fbebf-14ac-401d-bd39-1b4238cc2d2e" containerName="registry-server" containerID="cri-o://efdb079af2a5a46afa5f16052739368d559d1ceb5edaa3225f36ab2cdb077295" gracePeriod=2
containerID="cri-o://efdb079af2a5a46afa5f16052739368d559d1ceb5edaa3225f36ab2cdb077295" gracePeriod=2 Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.222827 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q4rrn" Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.318819 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/180fbebf-14ac-401d-bd39-1b4238cc2d2e-utilities\") pod \"180fbebf-14ac-401d-bd39-1b4238cc2d2e\" (UID: \"180fbebf-14ac-401d-bd39-1b4238cc2d2e\") " Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.318940 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtsp7\" (UniqueName: \"kubernetes.io/projected/180fbebf-14ac-401d-bd39-1b4238cc2d2e-kube-api-access-mtsp7\") pod \"180fbebf-14ac-401d-bd39-1b4238cc2d2e\" (UID: \"180fbebf-14ac-401d-bd39-1b4238cc2d2e\") " Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.318980 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/180fbebf-14ac-401d-bd39-1b4238cc2d2e-catalog-content\") pod \"180fbebf-14ac-401d-bd39-1b4238cc2d2e\" (UID: \"180fbebf-14ac-401d-bd39-1b4238cc2d2e\") " Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.320589 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/180fbebf-14ac-401d-bd39-1b4238cc2d2e-utilities" (OuterVolumeSpecName: "utilities") pod "180fbebf-14ac-401d-bd39-1b4238cc2d2e" (UID: "180fbebf-14ac-401d-bd39-1b4238cc2d2e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.324721 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/180fbebf-14ac-401d-bd39-1b4238cc2d2e-kube-api-access-mtsp7" (OuterVolumeSpecName: "kube-api-access-mtsp7") pod "180fbebf-14ac-401d-bd39-1b4238cc2d2e" (UID: "180fbebf-14ac-401d-bd39-1b4238cc2d2e"). InnerVolumeSpecName "kube-api-access-mtsp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.364764 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/180fbebf-14ac-401d-bd39-1b4238cc2d2e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "180fbebf-14ac-401d-bd39-1b4238cc2d2e" (UID: "180fbebf-14ac-401d-bd39-1b4238cc2d2e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.420635 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/180fbebf-14ac-401d-bd39-1b4238cc2d2e-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.420681 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtsp7\" (UniqueName: \"kubernetes.io/projected/180fbebf-14ac-401d-bd39-1b4238cc2d2e-kube-api-access-mtsp7\") on node \"crc\" DevicePath \"\"" Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.420691 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/180fbebf-14ac-401d-bd39-1b4238cc2d2e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.883713 4851 generic.go:334] "Generic (PLEG): container finished" podID="180fbebf-14ac-401d-bd39-1b4238cc2d2e" containerID="efdb079af2a5a46afa5f16052739368d559d1ceb5edaa3225f36ab2cdb077295" exitCode=0 Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.883774 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q4rrn" event={"ID":"180fbebf-14ac-401d-bd39-1b4238cc2d2e","Type":"ContainerDied","Data":"efdb079af2a5a46afa5f16052739368d559d1ceb5edaa3225f36ab2cdb077295"} Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.883824 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q4rrn" event={"ID":"180fbebf-14ac-401d-bd39-1b4238cc2d2e","Type":"ContainerDied","Data":"b38e4a9fe7ea3577590a8c42c9fee23bfb345856f935c3238fff572e37ad1975"} Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.883854 4851 scope.go:117] "RemoveContainer" containerID="efdb079af2a5a46afa5f16052739368d559d1ceb5edaa3225f36ab2cdb077295" Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.883866 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q4rrn" Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.911003 4851 scope.go:117] "RemoveContainer" containerID="cbef644caba019a6397bcbd24a21ed08ef78c941716580eba1ee2b8c1840637b" Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.945561 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q4rrn"] Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.947902 4851 scope.go:117] "RemoveContainer" containerID="66b57dd92841b58c5c24f5996dc26e270c83562a439bd06680c7a2fe52c9379e" Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.955868 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q4rrn"] Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.979188 4851 scope.go:117] "RemoveContainer" containerID="efdb079af2a5a46afa5f16052739368d559d1ceb5edaa3225f36ab2cdb077295" Oct 01 13:07:06 crc kubenswrapper[4851]: E1001 13:07:06.980035 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efdb079af2a5a46afa5f16052739368d559d1ceb5edaa3225f36ab2cdb077295\": container with ID starting with efdb079af2a5a46afa5f16052739368d559d1ceb5edaa3225f36ab2cdb077295 not found: ID does not exist" containerID="efdb079af2a5a46afa5f16052739368d559d1ceb5edaa3225f36ab2cdb077295" Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.980093 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efdb079af2a5a46afa5f16052739368d559d1ceb5edaa3225f36ab2cdb077295"} err="failed to get container status \"efdb079af2a5a46afa5f16052739368d559d1ceb5edaa3225f36ab2cdb077295\": rpc error: code = NotFound desc = could not find container \"efdb079af2a5a46afa5f16052739368d559d1ceb5edaa3225f36ab2cdb077295\": container with ID starting with efdb079af2a5a46afa5f16052739368d559d1ceb5edaa3225f36ab2cdb077295 not found: ID does not exist" Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.980129 4851 scope.go:117] "RemoveContainer" containerID="cbef644caba019a6397bcbd24a21ed08ef78c941716580eba1ee2b8c1840637b" Oct 01 13:07:06 crc kubenswrapper[4851]: E1001 13:07:06.980801 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbef644caba019a6397bcbd24a21ed08ef78c941716580eba1ee2b8c1840637b\": container with ID starting with cbef644caba019a6397bcbd24a21ed08ef78c941716580eba1ee2b8c1840637b not found: ID does not exist" containerID="cbef644caba019a6397bcbd24a21ed08ef78c941716580eba1ee2b8c1840637b" Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.980866 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbef644caba019a6397bcbd24a21ed08ef78c941716580eba1ee2b8c1840637b"} err="failed to get container status \"cbef644caba019a6397bcbd24a21ed08ef78c941716580eba1ee2b8c1840637b\": rpc error: code = NotFound desc = could not find container \"cbef644caba019a6397bcbd24a21ed08ef78c941716580eba1ee2b8c1840637b\": container with ID starting with cbef644caba019a6397bcbd24a21ed08ef78c941716580eba1ee2b8c1840637b not found: ID does not exist" Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.980896 4851 scope.go:117] "RemoveContainer" containerID="66b57dd92841b58c5c24f5996dc26e270c83562a439bd06680c7a2fe52c9379e" Oct 01 13:07:06 crc kubenswrapper[4851]: E1001 13:07:06.981248 4851 log.go:32] "ContainerStatus from runtime service 
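Each "Cleaned up orphaned pod volumes dir" entry in this section marks the last trace of a deleted pod: once every volume under /var/lib/kubelet/pods/<uid>/volumes is unmounted and detached, kubelet removes the directory. A read-only sketch that lists those directories so they can be compared with the pods the API server still knows about (the path comes from the log; run it as root on the node):

```go
// orphan_check.go - list per-pod volume directories under the kubelet
// data dir. Anything here whose UID matches no live pod is a candidate
// for the orphan cleanup seen in the log.
package main

import (
	"fmt"
	"path/filepath"
)

func main() {
	dirs, err := filepath.Glob("/var/lib/kubelet/pods/*/volumes")
	if err != nil {
		panic(err)
	}
	for _, d := range dirs {
		uid := filepath.Base(filepath.Dir(d)) // pod UID is the parent dir name
		fmt.Println(uid, "->", d)
	}
}
```

In the log above this cleanup consistently trails the volume teardown by a sync period or two, which is why the UnmountVolume, "Volume detached", and "Cleaned up orphaned pod volumes dir" entries for one pod are spread across several seconds.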
failed" err="rpc error: code = NotFound desc = could not find container \"66b57dd92841b58c5c24f5996dc26e270c83562a439bd06680c7a2fe52c9379e\": container with ID starting with 66b57dd92841b58c5c24f5996dc26e270c83562a439bd06680c7a2fe52c9379e not found: ID does not exist" containerID="66b57dd92841b58c5c24f5996dc26e270c83562a439bd06680c7a2fe52c9379e" Oct 01 13:07:06 crc kubenswrapper[4851]: I1001 13:07:06.981277 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66b57dd92841b58c5c24f5996dc26e270c83562a439bd06680c7a2fe52c9379e"} err="failed to get container status \"66b57dd92841b58c5c24f5996dc26e270c83562a439bd06680c7a2fe52c9379e\": rpc error: code = NotFound desc = could not find container \"66b57dd92841b58c5c24f5996dc26e270c83562a439bd06680c7a2fe52c9379e\": container with ID starting with 66b57dd92841b58c5c24f5996dc26e270c83562a439bd06680c7a2fe52c9379e not found: ID does not exist" Oct 01 13:07:08 crc kubenswrapper[4851]: I1001 13:07:08.342418 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="180fbebf-14ac-401d-bd39-1b4238cc2d2e" path="/var/lib/kubelet/pods/180fbebf-14ac-401d-bd39-1b4238cc2d2e/volumes" Oct 01 13:07:14 crc kubenswrapper[4851]: I1001 13:07:14.768646 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-bc7c7cbf4-vznd6" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.658395 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-8jcrr"] Oct 01 13:07:15 crc kubenswrapper[4851]: E1001 13:07:15.658670 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd203b3c-a861-4dba-9df9-7328c541c294" containerName="extract-utilities" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.658688 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd203b3c-a861-4dba-9df9-7328c541c294" containerName="extract-utilities" Oct 01 13:07:15 crc kubenswrapper[4851]: E1001 13:07:15.658703 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="180fbebf-14ac-401d-bd39-1b4238cc2d2e" containerName="registry-server" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.658710 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="180fbebf-14ac-401d-bd39-1b4238cc2d2e" containerName="registry-server" Oct 01 13:07:15 crc kubenswrapper[4851]: E1001 13:07:15.658724 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7297cb-e287-4224-909c-c51f28349ff4" containerName="extract-utilities" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.658732 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7297cb-e287-4224-909c-c51f28349ff4" containerName="extract-utilities" Oct 01 13:07:15 crc kubenswrapper[4851]: E1001 13:07:15.658745 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd203b3c-a861-4dba-9df9-7328c541c294" containerName="extract-content" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.658752 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd203b3c-a861-4dba-9df9-7328c541c294" containerName="extract-content" Oct 01 13:07:15 crc kubenswrapper[4851]: E1001 13:07:15.658763 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="180fbebf-14ac-401d-bd39-1b4238cc2d2e" containerName="extract-utilities" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.658770 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="180fbebf-14ac-401d-bd39-1b4238cc2d2e" containerName="extract-utilities" Oct 
01 13:07:15 crc kubenswrapper[4851]: E1001 13:07:15.658781 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd203b3c-a861-4dba-9df9-7328c541c294" containerName="registry-server" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.658788 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd203b3c-a861-4dba-9df9-7328c541c294" containerName="registry-server" Oct 01 13:07:15 crc kubenswrapper[4851]: E1001 13:07:15.658804 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7297cb-e287-4224-909c-c51f28349ff4" containerName="extract-content" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.658811 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7297cb-e287-4224-909c-c51f28349ff4" containerName="extract-content" Oct 01 13:07:15 crc kubenswrapper[4851]: E1001 13:07:15.658823 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7297cb-e287-4224-909c-c51f28349ff4" containerName="registry-server" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.658830 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7297cb-e287-4224-909c-c51f28349ff4" containerName="registry-server" Oct 01 13:07:15 crc kubenswrapper[4851]: E1001 13:07:15.658841 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="180fbebf-14ac-401d-bd39-1b4238cc2d2e" containerName="extract-content" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.658848 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="180fbebf-14ac-401d-bd39-1b4238cc2d2e" containerName="extract-content" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.658969 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd203b3c-a861-4dba-9df9-7328c541c294" containerName="registry-server" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.658985 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e7297cb-e287-4224-909c-c51f28349ff4" containerName="registry-server" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.658997 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="180fbebf-14ac-401d-bd39-1b4238cc2d2e" containerName="registry-server" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.661287 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.665033 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-wqlkq"] Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.666547 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-wqlkq" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.687476 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.688029 4851 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-lvn69" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.688293 4851 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.690700 4851 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.702224 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-wqlkq"] Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.755624 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-bgphr"] Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.756478 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-bgphr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.759877 4851 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.760556 4851 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.761692 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.764393 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rcxj\" (UniqueName: \"kubernetes.io/projected/35ad8647-b8e1-4935-a669-ae418db665b7-kube-api-access-6rcxj\") pod \"frr-k8s-webhook-server-5478bdb765-wqlkq\" (UID: \"35ad8647-b8e1-4935-a669-ae418db665b7\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-wqlkq" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.764434 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e3ef3f0b-665a-491b-857a-5ee2c5614f90-frr-sockets\") pod \"frr-k8s-8jcrr\" (UID: \"e3ef3f0b-665a-491b-857a-5ee2c5614f90\") " pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.764462 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e3ef3f0b-665a-491b-857a-5ee2c5614f90-metrics\") pod \"frr-k8s-8jcrr\" (UID: \"e3ef3f0b-665a-491b-857a-5ee2c5614f90\") " pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.764477 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw5zj\" (UniqueName: \"kubernetes.io/projected/e3ef3f0b-665a-491b-857a-5ee2c5614f90-kube-api-access-fw5zj\") pod \"frr-k8s-8jcrr\" (UID: \"e3ef3f0b-665a-491b-857a-5ee2c5614f90\") " pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.764509 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e3ef3f0b-665a-491b-857a-5ee2c5614f90-frr-conf\") pod \"frr-k8s-8jcrr\" (UID: \"e3ef3f0b-665a-491b-857a-5ee2c5614f90\") " pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.764533 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e3ef3f0b-665a-491b-857a-5ee2c5614f90-reloader\") pod \"frr-k8s-8jcrr\" (UID: \"e3ef3f0b-665a-491b-857a-5ee2c5614f90\") " pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.764677 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3ef3f0b-665a-491b-857a-5ee2c5614f90-metrics-certs\") pod \"frr-k8s-8jcrr\" (UID: \"e3ef3f0b-665a-491b-857a-5ee2c5614f90\") " pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.764730 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e3ef3f0b-665a-491b-857a-5ee2c5614f90-frr-startup\") pod \"frr-k8s-8jcrr\" (UID: \"e3ef3f0b-665a-491b-857a-5ee2c5614f90\") " pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.764796 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35ad8647-b8e1-4935-a669-ae418db665b7-cert\") pod \"frr-k8s-webhook-server-5478bdb765-wqlkq\" (UID: \"35ad8647-b8e1-4935-a669-ae418db665b7\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-wqlkq" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.765799 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-wrnrn"] Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.766680 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-wrnrn" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.768690 4851 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.775636 4851 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-xlsbh" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.777462 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-wrnrn"] Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.865694 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rcxj\" (UniqueName: \"kubernetes.io/projected/35ad8647-b8e1-4935-a669-ae418db665b7-kube-api-access-6rcxj\") pod \"frr-k8s-webhook-server-5478bdb765-wqlkq\" (UID: \"35ad8647-b8e1-4935-a669-ae418db665b7\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-wqlkq" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.865762 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e3ef3f0b-665a-491b-857a-5ee2c5614f90-frr-sockets\") pod \"frr-k8s-8jcrr\" (UID: \"e3ef3f0b-665a-491b-857a-5ee2c5614f90\") " pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.865793 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8n6n\" (UniqueName: \"kubernetes.io/projected/d8ce22ba-fa18-4924-9c48-360cd16f0857-kube-api-access-p8n6n\") pod \"speaker-bgphr\" (UID: \"d8ce22ba-fa18-4924-9c48-360cd16f0857\") " pod="metallb-system/speaker-bgphr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.865833 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8516a5e5-0b45-41d2-baa6-2680bc58eb9b-metrics-certs\") pod \"controller-5d688f5ffc-wrnrn\" (UID: \"8516a5e5-0b45-41d2-baa6-2680bc58eb9b\") " pod="metallb-system/controller-5d688f5ffc-wrnrn" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.865853 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e3ef3f0b-665a-491b-857a-5ee2c5614f90-metrics\") pod \"frr-k8s-8jcrr\" (UID: \"e3ef3f0b-665a-491b-857a-5ee2c5614f90\") " pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.865868 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw5zj\" (UniqueName: \"kubernetes.io/projected/e3ef3f0b-665a-491b-857a-5ee2c5614f90-kube-api-access-fw5zj\") pod \"frr-k8s-8jcrr\" (UID: \"e3ef3f0b-665a-491b-857a-5ee2c5614f90\") " pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.865899 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d8ce22ba-fa18-4924-9c48-360cd16f0857-metrics-certs\") pod \"speaker-bgphr\" (UID: \"d8ce22ba-fa18-4924-9c48-360cd16f0857\") " pod="metallb-system/speaker-bgphr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.866123 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8516a5e5-0b45-41d2-baa6-2680bc58eb9b-cert\") pod 
\"controller-5d688f5ffc-wrnrn\" (UID: \"8516a5e5-0b45-41d2-baa6-2680bc58eb9b\") " pod="metallb-system/controller-5d688f5ffc-wrnrn" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.866139 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e3ef3f0b-665a-491b-857a-5ee2c5614f90-frr-conf\") pod \"frr-k8s-8jcrr\" (UID: \"e3ef3f0b-665a-491b-857a-5ee2c5614f90\") " pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.866154 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d8ce22ba-fa18-4924-9c48-360cd16f0857-metallb-excludel2\") pod \"speaker-bgphr\" (UID: \"d8ce22ba-fa18-4924-9c48-360cd16f0857\") " pod="metallb-system/speaker-bgphr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.866193 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e3ef3f0b-665a-491b-857a-5ee2c5614f90-reloader\") pod \"frr-k8s-8jcrr\" (UID: \"e3ef3f0b-665a-491b-857a-5ee2c5614f90\") " pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.866218 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d8ce22ba-fa18-4924-9c48-360cd16f0857-memberlist\") pod \"speaker-bgphr\" (UID: \"d8ce22ba-fa18-4924-9c48-360cd16f0857\") " pod="metallb-system/speaker-bgphr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.866235 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3ef3f0b-665a-491b-857a-5ee2c5614f90-metrics-certs\") pod \"frr-k8s-8jcrr\" (UID: \"e3ef3f0b-665a-491b-857a-5ee2c5614f90\") " pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.866252 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e3ef3f0b-665a-491b-857a-5ee2c5614f90-frr-startup\") pod \"frr-k8s-8jcrr\" (UID: \"e3ef3f0b-665a-491b-857a-5ee2c5614f90\") " pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.866293 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35ad8647-b8e1-4935-a669-ae418db665b7-cert\") pod \"frr-k8s-webhook-server-5478bdb765-wqlkq\" (UID: \"35ad8647-b8e1-4935-a669-ae418db665b7\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-wqlkq" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.866308 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq7m2\" (UniqueName: \"kubernetes.io/projected/8516a5e5-0b45-41d2-baa6-2680bc58eb9b-kube-api-access-dq7m2\") pod \"controller-5d688f5ffc-wrnrn\" (UID: \"8516a5e5-0b45-41d2-baa6-2680bc58eb9b\") " pod="metallb-system/controller-5d688f5ffc-wrnrn" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.866367 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e3ef3f0b-665a-491b-857a-5ee2c5614f90-frr-sockets\") pod \"frr-k8s-8jcrr\" (UID: \"e3ef3f0b-665a-491b-857a-5ee2c5614f90\") " pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:15 crc kubenswrapper[4851]: E1001 13:07:15.866555 4851 
secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 01 13:07:15 crc kubenswrapper[4851]: E1001 13:07:15.866683 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35ad8647-b8e1-4935-a669-ae418db665b7-cert podName:35ad8647-b8e1-4935-a669-ae418db665b7 nodeName:}" failed. No retries permitted until 2025-10-01 13:07:16.366663092 +0000 UTC m=+844.711780588 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35ad8647-b8e1-4935-a669-ae418db665b7-cert") pod "frr-k8s-webhook-server-5478bdb765-wqlkq" (UID: "35ad8647-b8e1-4935-a669-ae418db665b7") : secret "frr-k8s-webhook-server-cert" not found Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.866741 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e3ef3f0b-665a-491b-857a-5ee2c5614f90-frr-conf\") pod \"frr-k8s-8jcrr\" (UID: \"e3ef3f0b-665a-491b-857a-5ee2c5614f90\") " pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.866860 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e3ef3f0b-665a-491b-857a-5ee2c5614f90-metrics\") pod \"frr-k8s-8jcrr\" (UID: \"e3ef3f0b-665a-491b-857a-5ee2c5614f90\") " pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.867239 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e3ef3f0b-665a-491b-857a-5ee2c5614f90-frr-startup\") pod \"frr-k8s-8jcrr\" (UID: \"e3ef3f0b-665a-491b-857a-5ee2c5614f90\") " pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.867328 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e3ef3f0b-665a-491b-857a-5ee2c5614f90-reloader\") pod \"frr-k8s-8jcrr\" (UID: \"e3ef3f0b-665a-491b-857a-5ee2c5614f90\") " pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.871889 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3ef3f0b-665a-491b-857a-5ee2c5614f90-metrics-certs\") pod \"frr-k8s-8jcrr\" (UID: \"e3ef3f0b-665a-491b-857a-5ee2c5614f90\") " pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.905135 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rcxj\" (UniqueName: \"kubernetes.io/projected/35ad8647-b8e1-4935-a669-ae418db665b7-kube-api-access-6rcxj\") pod \"frr-k8s-webhook-server-5478bdb765-wqlkq\" (UID: \"35ad8647-b8e1-4935-a669-ae418db665b7\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-wqlkq" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.905587 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw5zj\" (UniqueName: \"kubernetes.io/projected/e3ef3f0b-665a-491b-857a-5ee2c5614f90-kube-api-access-fw5zj\") pod \"frr-k8s-8jcrr\" (UID: \"e3ef3f0b-665a-491b-857a-5ee2c5614f90\") " pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.967965 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq7m2\" (UniqueName: 
\"kubernetes.io/projected/8516a5e5-0b45-41d2-baa6-2680bc58eb9b-kube-api-access-dq7m2\") pod \"controller-5d688f5ffc-wrnrn\" (UID: \"8516a5e5-0b45-41d2-baa6-2680bc58eb9b\") " pod="metallb-system/controller-5d688f5ffc-wrnrn" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.968346 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8n6n\" (UniqueName: \"kubernetes.io/projected/d8ce22ba-fa18-4924-9c48-360cd16f0857-kube-api-access-p8n6n\") pod \"speaker-bgphr\" (UID: \"d8ce22ba-fa18-4924-9c48-360cd16f0857\") " pod="metallb-system/speaker-bgphr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.968424 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8516a5e5-0b45-41d2-baa6-2680bc58eb9b-metrics-certs\") pod \"controller-5d688f5ffc-wrnrn\" (UID: \"8516a5e5-0b45-41d2-baa6-2680bc58eb9b\") " pod="metallb-system/controller-5d688f5ffc-wrnrn" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.968515 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d8ce22ba-fa18-4924-9c48-360cd16f0857-metrics-certs\") pod \"speaker-bgphr\" (UID: \"d8ce22ba-fa18-4924-9c48-360cd16f0857\") " pod="metallb-system/speaker-bgphr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.968605 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8516a5e5-0b45-41d2-baa6-2680bc58eb9b-cert\") pod \"controller-5d688f5ffc-wrnrn\" (UID: \"8516a5e5-0b45-41d2-baa6-2680bc58eb9b\") " pod="metallb-system/controller-5d688f5ffc-wrnrn" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.968680 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d8ce22ba-fa18-4924-9c48-360cd16f0857-metallb-excludel2\") pod \"speaker-bgphr\" (UID: \"d8ce22ba-fa18-4924-9c48-360cd16f0857\") " pod="metallb-system/speaker-bgphr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.968774 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d8ce22ba-fa18-4924-9c48-360cd16f0857-memberlist\") pod \"speaker-bgphr\" (UID: \"d8ce22ba-fa18-4924-9c48-360cd16f0857\") " pod="metallb-system/speaker-bgphr" Oct 01 13:07:15 crc kubenswrapper[4851]: E1001 13:07:15.968568 4851 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 01 13:07:15 crc kubenswrapper[4851]: E1001 13:07:15.969037 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8516a5e5-0b45-41d2-baa6-2680bc58eb9b-metrics-certs podName:8516a5e5-0b45-41d2-baa6-2680bc58eb9b nodeName:}" failed. No retries permitted until 2025-10-01 13:07:16.469018019 +0000 UTC m=+844.814135505 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8516a5e5-0b45-41d2-baa6-2680bc58eb9b-metrics-certs") pod "controller-5d688f5ffc-wrnrn" (UID: "8516a5e5-0b45-41d2-baa6-2680bc58eb9b") : secret "controller-certs-secret" not found Oct 01 13:07:15 crc kubenswrapper[4851]: E1001 13:07:15.968947 4851 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 01 13:07:15 crc kubenswrapper[4851]: E1001 13:07:15.969176 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8ce22ba-fa18-4924-9c48-360cd16f0857-memberlist podName:d8ce22ba-fa18-4924-9c48-360cd16f0857 nodeName:}" failed. No retries permitted until 2025-10-01 13:07:16.469146763 +0000 UTC m=+844.814264249 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d8ce22ba-fa18-4924-9c48-360cd16f0857-memberlist") pod "speaker-bgphr" (UID: "d8ce22ba-fa18-4924-9c48-360cd16f0857") : secret "metallb-memberlist" not found Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.969838 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d8ce22ba-fa18-4924-9c48-360cd16f0857-metallb-excludel2\") pod \"speaker-bgphr\" (UID: \"d8ce22ba-fa18-4924-9c48-360cd16f0857\") " pod="metallb-system/speaker-bgphr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.974537 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d8ce22ba-fa18-4924-9c48-360cd16f0857-metrics-certs\") pod \"speaker-bgphr\" (UID: \"d8ce22ba-fa18-4924-9c48-360cd16f0857\") " pod="metallb-system/speaker-bgphr" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.987189 4851 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 01 13:07:15 crc kubenswrapper[4851]: I1001 13:07:15.993979 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8516a5e5-0b45-41d2-baa6-2680bc58eb9b-cert\") pod \"controller-5d688f5ffc-wrnrn\" (UID: \"8516a5e5-0b45-41d2-baa6-2680bc58eb9b\") " pod="metallb-system/controller-5d688f5ffc-wrnrn" Oct 01 13:07:16 crc kubenswrapper[4851]: I1001 13:07:16.002868 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq7m2\" (UniqueName: \"kubernetes.io/projected/8516a5e5-0b45-41d2-baa6-2680bc58eb9b-kube-api-access-dq7m2\") pod \"controller-5d688f5ffc-wrnrn\" (UID: \"8516a5e5-0b45-41d2-baa6-2680bc58eb9b\") " pod="metallb-system/controller-5d688f5ffc-wrnrn" Oct 01 13:07:16 crc kubenswrapper[4851]: I1001 13:07:16.004993 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:16 crc kubenswrapper[4851]: I1001 13:07:16.015066 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8n6n\" (UniqueName: \"kubernetes.io/projected/d8ce22ba-fa18-4924-9c48-360cd16f0857-kube-api-access-p8n6n\") pod \"speaker-bgphr\" (UID: \"d8ce22ba-fa18-4924-9c48-360cd16f0857\") " pod="metallb-system/speaker-bgphr" Oct 01 13:07:16 crc kubenswrapper[4851]: I1001 13:07:16.372479 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35ad8647-b8e1-4935-a669-ae418db665b7-cert\") pod \"frr-k8s-webhook-server-5478bdb765-wqlkq\" (UID: \"35ad8647-b8e1-4935-a669-ae418db665b7\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-wqlkq" Oct 01 13:07:16 crc kubenswrapper[4851]: I1001 13:07:16.379497 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35ad8647-b8e1-4935-a669-ae418db665b7-cert\") pod \"frr-k8s-webhook-server-5478bdb765-wqlkq\" (UID: \"35ad8647-b8e1-4935-a669-ae418db665b7\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-wqlkq" Oct 01 13:07:16 crc kubenswrapper[4851]: I1001 13:07:16.473710 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8516a5e5-0b45-41d2-baa6-2680bc58eb9b-metrics-certs\") pod \"controller-5d688f5ffc-wrnrn\" (UID: \"8516a5e5-0b45-41d2-baa6-2680bc58eb9b\") " pod="metallb-system/controller-5d688f5ffc-wrnrn" Oct 01 13:07:16 crc kubenswrapper[4851]: I1001 13:07:16.473941 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d8ce22ba-fa18-4924-9c48-360cd16f0857-memberlist\") pod \"speaker-bgphr\" (UID: \"d8ce22ba-fa18-4924-9c48-360cd16f0857\") " pod="metallb-system/speaker-bgphr" Oct 01 13:07:16 crc kubenswrapper[4851]: E1001 13:07:16.474444 4851 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 01 13:07:16 crc kubenswrapper[4851]: E1001 13:07:16.474690 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8ce22ba-fa18-4924-9c48-360cd16f0857-memberlist podName:d8ce22ba-fa18-4924-9c48-360cd16f0857 nodeName:}" failed. No retries permitted until 2025-10-01 13:07:17.474646379 +0000 UTC m=+845.819763905 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d8ce22ba-fa18-4924-9c48-360cd16f0857-memberlist") pod "speaker-bgphr" (UID: "d8ce22ba-fa18-4924-9c48-360cd16f0857") : secret "metallb-memberlist" not found Oct 01 13:07:16 crc kubenswrapper[4851]: I1001 13:07:16.479281 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8516a5e5-0b45-41d2-baa6-2680bc58eb9b-metrics-certs\") pod \"controller-5d688f5ffc-wrnrn\" (UID: \"8516a5e5-0b45-41d2-baa6-2680bc58eb9b\") " pod="metallb-system/controller-5d688f5ffc-wrnrn" Oct 01 13:07:16 crc kubenswrapper[4851]: I1001 13:07:16.620931 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-wqlkq" Oct 01 13:07:16 crc kubenswrapper[4851]: I1001 13:07:16.701186 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-wrnrn" Oct 01 13:07:16 crc kubenswrapper[4851]: I1001 13:07:16.968070 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8jcrr" event={"ID":"e3ef3f0b-665a-491b-857a-5ee2c5614f90","Type":"ContainerStarted","Data":"a1761364c44da4c03e97ba56419adb4c6b75e07c5e0a0f3a9f2435ae7b33f7d4"} Oct 01 13:07:17 crc kubenswrapper[4851]: I1001 13:07:17.106000 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-wqlkq"] Oct 01 13:07:17 crc kubenswrapper[4851]: I1001 13:07:17.199844 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-wrnrn"] Oct 01 13:07:17 crc kubenswrapper[4851]: W1001 13:07:17.205664 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8516a5e5_0b45_41d2_baa6_2680bc58eb9b.slice/crio-8ee90462c4336bfb0c34d04ccdfdc9a5f8fccdd93cb00857304ce3751f0799ec WatchSource:0}: Error finding container 8ee90462c4336bfb0c34d04ccdfdc9a5f8fccdd93cb00857304ce3751f0799ec: Status 404 returned error can't find the container with id 8ee90462c4336bfb0c34d04ccdfdc9a5f8fccdd93cb00857304ce3751f0799ec Oct 01 13:07:17 crc kubenswrapper[4851]: I1001 13:07:17.489599 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d8ce22ba-fa18-4924-9c48-360cd16f0857-memberlist\") pod \"speaker-bgphr\" (UID: \"d8ce22ba-fa18-4924-9c48-360cd16f0857\") " pod="metallb-system/speaker-bgphr" Oct 01 13:07:17 crc kubenswrapper[4851]: I1001 13:07:17.494620 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d8ce22ba-fa18-4924-9c48-360cd16f0857-memberlist\") pod \"speaker-bgphr\" (UID: \"d8ce22ba-fa18-4924-9c48-360cd16f0857\") " pod="metallb-system/speaker-bgphr" Oct 01 13:07:17 crc kubenswrapper[4851]: I1001 13:07:17.581944 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-bgphr" Oct 01 13:07:17 crc kubenswrapper[4851]: W1001 13:07:17.598786 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8ce22ba_fa18_4924_9c48_360cd16f0857.slice/crio-1689985483cc13e6cebd4beb7de6b5c2c999d6e12a22867347d8aaccffaaa749 WatchSource:0}: Error finding container 1689985483cc13e6cebd4beb7de6b5c2c999d6e12a22867347d8aaccffaaa749: Status 404 returned error can't find the container with id 1689985483cc13e6cebd4beb7de6b5c2c999d6e12a22867347d8aaccffaaa749 Oct 01 13:07:17 crc kubenswrapper[4851]: I1001 13:07:17.977820 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-wrnrn" event={"ID":"8516a5e5-0b45-41d2-baa6-2680bc58eb9b","Type":"ContainerStarted","Data":"7ca352a29a6dfa98f20c64a7d9b1e3bdc99fe4a4b84c5ea2cf9dd57a6e311c3d"} Oct 01 13:07:17 crc kubenswrapper[4851]: I1001 13:07:17.978090 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-wrnrn" Oct 01 13:07:17 crc kubenswrapper[4851]: I1001 13:07:17.978102 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-wrnrn" event={"ID":"8516a5e5-0b45-41d2-baa6-2680bc58eb9b","Type":"ContainerStarted","Data":"6b534aea18c8087370f97119b0fc8a5adf7e22cb8d8a04dd47dcc7d9a413616a"} Oct 01 13:07:17 crc kubenswrapper[4851]: I1001 13:07:17.978113 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-wrnrn" event={"ID":"8516a5e5-0b45-41d2-baa6-2680bc58eb9b","Type":"ContainerStarted","Data":"8ee90462c4336bfb0c34d04ccdfdc9a5f8fccdd93cb00857304ce3751f0799ec"} Oct 01 13:07:17 crc kubenswrapper[4851]: I1001 13:07:17.979243 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-wqlkq" event={"ID":"35ad8647-b8e1-4935-a669-ae418db665b7","Type":"ContainerStarted","Data":"b65173ae1e0be33446f5e31ea354c46d08647f19220a3dc579149fbc49fd9bab"} Oct 01 13:07:17 crc kubenswrapper[4851]: I1001 13:07:17.986120 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bgphr" event={"ID":"d8ce22ba-fa18-4924-9c48-360cd16f0857","Type":"ContainerStarted","Data":"5182aeecf5ec3b6220922eeb3b1912acafa25d73dfb0c10c267a93b840e2a7e5"} Oct 01 13:07:17 crc kubenswrapper[4851]: I1001 13:07:17.986160 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bgphr" event={"ID":"d8ce22ba-fa18-4924-9c48-360cd16f0857","Type":"ContainerStarted","Data":"1689985483cc13e6cebd4beb7de6b5c2c999d6e12a22867347d8aaccffaaa749"} Oct 01 13:07:18 crc kubenswrapper[4851]: I1001 13:07:17.995958 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-wrnrn" podStartSLOduration=2.995940414 podStartE2EDuration="2.995940414s" podCreationTimestamp="2025-10-01 13:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:07:17.993431452 +0000 UTC m=+846.338548938" watchObservedRunningTime="2025-10-01 13:07:17.995940414 +0000 UTC m=+846.341057900" Oct 01 13:07:18 crc kubenswrapper[4851]: I1001 13:07:18.996890 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bgphr" event={"ID":"d8ce22ba-fa18-4924-9c48-360cd16f0857","Type":"ContainerStarted","Data":"a26f167e8a897076f3cbe19ab96610726b1cfbea8556fc1296b8724d1b19cc29"} 
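(Editor's aside: the records above capture a retry pattern worth noting. MountVolume.SetUp for the "memberlist" volume fails while the secret "metallb-memberlist" does not yet exist; kubelet schedules the first retry with durationBeforeRetry 500ms, the next failure doubles it to 1s, and the mount finally succeeds at 13:07:17.49 once the secret has been created. Below is a minimal Go sketch of that doubling-backoff shape, under stated assumptions: fetchSecret, the attempt at which it starts succeeding, and the printed strings are illustrative stand-ins, not kubelet's actual implementation, whose real backoff in the nestedpendingoperations lines above is tracked per volume operation and capped rather than unbounded.)

    // backoff_sketch.go: illustrative only, not kubelet source.
    // Models the doubling retry delay visible in the log records above:
    // a mount fails while a Secret is missing, each retry is scheduled
    // with twice the previous delay (500ms, then 1s, ...), and the loop
    // exits as soon as the Secret shows up and the mount succeeds.
    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // fetchSecret is a hypothetical stand-in for the API lookup that
    // fails with `secret "metallb-memberlist" not found` above.
    func fetchSecret(attempt int) error {
        if attempt < 3 { // assume the Secret appears by the 3rd try
            return errors.New(`secret "metallb-memberlist" not found`)
        }
        return nil
    }

    func main() {
        delay := 500 * time.Millisecond // initial durationBeforeRetry
        for attempt := 1; ; attempt++ {
            if err := fetchSecret(attempt); err != nil {
                fmt.Printf("attempt %d failed: %v; no retries permitted for %v\n",
                    attempt, err, delay)
                time.Sleep(delay)
                delay *= 2 // doubling, as seen going from 500ms to 1s
                continue
            }
            fmt.Printf("attempt %d: MountVolume.SetUp succeeded\n", attempt)
            return
        }
    }

(The same resolution is visible in the stream: once the secret exists, the pending operation is retried on schedule and the pod proceeds to sandbox creation and ContainerStarted events.)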
Oct 01 13:07:19 crc kubenswrapper[4851]: I1001 13:07:19.016108 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-bgphr" podStartSLOduration=4.016089845 podStartE2EDuration="4.016089845s" podCreationTimestamp="2025-10-01 13:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:07:19.014024546 +0000 UTC m=+847.359142032" watchObservedRunningTime="2025-10-01 13:07:19.016089845 +0000 UTC m=+847.361207331" Oct 01 13:07:20 crc kubenswrapper[4851]: I1001 13:07:20.001923 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-bgphr" Oct 01 13:07:24 crc kubenswrapper[4851]: I1001 13:07:24.051435 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-wqlkq" event={"ID":"35ad8647-b8e1-4935-a669-ae418db665b7","Type":"ContainerStarted","Data":"26a6921be80c9d14dc7a42128ee8757d472655c35de50691b9ca6fe581862a24"} Oct 01 13:07:24 crc kubenswrapper[4851]: I1001 13:07:24.052301 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-wqlkq" Oct 01 13:07:24 crc kubenswrapper[4851]: I1001 13:07:24.055946 4851 generic.go:334] "Generic (PLEG): container finished" podID="e3ef3f0b-665a-491b-857a-5ee2c5614f90" containerID="3ced099b7c2be7fce44c1e9a39102fc818e69e20ee639f7bd6a1a7ec8bc6151d" exitCode=0 Oct 01 13:07:24 crc kubenswrapper[4851]: I1001 13:07:24.055976 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8jcrr" event={"ID":"e3ef3f0b-665a-491b-857a-5ee2c5614f90","Type":"ContainerDied","Data":"3ced099b7c2be7fce44c1e9a39102fc818e69e20ee639f7bd6a1a7ec8bc6151d"} Oct 01 13:07:24 crc kubenswrapper[4851]: I1001 13:07:24.085808 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-wqlkq" podStartSLOduration=2.694089891 podStartE2EDuration="9.085784823s" podCreationTimestamp="2025-10-01 13:07:15 +0000 UTC" firstStartedPulling="2025-10-01 13:07:17.119969886 +0000 UTC m=+845.465087412" lastFinishedPulling="2025-10-01 13:07:23.511664868 +0000 UTC m=+851.856782344" observedRunningTime="2025-10-01 13:07:24.083603991 +0000 UTC m=+852.428721517" watchObservedRunningTime="2025-10-01 13:07:24.085784823 +0000 UTC m=+852.430902359" Oct 01 13:07:25 crc kubenswrapper[4851]: I1001 13:07:25.068834 4851 generic.go:334] "Generic (PLEG): container finished" podID="e3ef3f0b-665a-491b-857a-5ee2c5614f90" containerID="a3cc1b73e5f028abeaf844446a71bac5022e54b899919b289c774f2e4be608c2" exitCode=0 Oct 01 13:07:25 crc kubenswrapper[4851]: I1001 13:07:25.068942 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8jcrr" event={"ID":"e3ef3f0b-665a-491b-857a-5ee2c5614f90","Type":"ContainerDied","Data":"a3cc1b73e5f028abeaf844446a71bac5022e54b899919b289c774f2e4be608c2"} Oct 01 13:07:26 crc kubenswrapper[4851]: I1001 13:07:26.081880 4851 generic.go:334] "Generic (PLEG): container finished" podID="e3ef3f0b-665a-491b-857a-5ee2c5614f90" containerID="7f32f4f996f4ef1284fc4670f3c3237c0b9ed481795fe9e258817ccb329e4b47" exitCode=0 Oct 01 13:07:26 crc kubenswrapper[4851]: I1001 13:07:26.081996 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8jcrr" 
event={"ID":"e3ef3f0b-665a-491b-857a-5ee2c5614f90","Type":"ContainerDied","Data":"7f32f4f996f4ef1284fc4670f3c3237c0b9ed481795fe9e258817ccb329e4b47"} Oct 01 13:07:27 crc kubenswrapper[4851]: I1001 13:07:27.096360 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8jcrr" event={"ID":"e3ef3f0b-665a-491b-857a-5ee2c5614f90","Type":"ContainerStarted","Data":"dd22efa0e706f2c736fdbe4ba76798378a60eb954604193ab2b61aad1db00132"} Oct 01 13:07:27 crc kubenswrapper[4851]: I1001 13:07:27.096684 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8jcrr" event={"ID":"e3ef3f0b-665a-491b-857a-5ee2c5614f90","Type":"ContainerStarted","Data":"8564cfa6b2bae83fc4051d683bcb82ddc505058bfff198a78075b9a6c1601e69"} Oct 01 13:07:27 crc kubenswrapper[4851]: I1001 13:07:27.096697 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8jcrr" event={"ID":"e3ef3f0b-665a-491b-857a-5ee2c5614f90","Type":"ContainerStarted","Data":"45669d45794e516b2f4a09438a4b413779688df3198a14170de0c3cf646cd762"} Oct 01 13:07:27 crc kubenswrapper[4851]: I1001 13:07:27.096708 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8jcrr" event={"ID":"e3ef3f0b-665a-491b-857a-5ee2c5614f90","Type":"ContainerStarted","Data":"a6defd3f35a05c5db6b58df5c3f13b5408f7223e63416d98e93fb81ec05658cc"} Oct 01 13:07:27 crc kubenswrapper[4851]: I1001 13:07:27.586614 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-bgphr" Oct 01 13:07:28 crc kubenswrapper[4851]: I1001 13:07:28.118715 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8jcrr" event={"ID":"e3ef3f0b-665a-491b-857a-5ee2c5614f90","Type":"ContainerStarted","Data":"3be834b9d901a29cc96857d1f1ea330f559ef130e246e4eafc9dad4e34b95b14"} Oct 01 13:07:28 crc kubenswrapper[4851]: I1001 13:07:28.118758 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8jcrr" event={"ID":"e3ef3f0b-665a-491b-857a-5ee2c5614f90","Type":"ContainerStarted","Data":"2b2c5275cd630802b4100e5ec87e14722d4730d06c39f423e667bd3aac23cdfb"} Oct 01 13:07:28 crc kubenswrapper[4851]: I1001 13:07:28.118897 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:28 crc kubenswrapper[4851]: I1001 13:07:28.166206 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-8jcrr" podStartSLOduration=5.7894814 podStartE2EDuration="13.166180695s" podCreationTimestamp="2025-10-01 13:07:15 +0000 UTC" firstStartedPulling="2025-10-01 13:07:16.128579221 +0000 UTC m=+844.473696697" lastFinishedPulling="2025-10-01 13:07:23.505278506 +0000 UTC m=+851.850395992" observedRunningTime="2025-10-01 13:07:28.16286908 +0000 UTC m=+856.507986606" watchObservedRunningTime="2025-10-01 13:07:28.166180695 +0000 UTC m=+856.511298211" Oct 01 13:07:30 crc kubenswrapper[4851]: I1001 13:07:30.966655 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-qcfdn"] Oct 01 13:07:30 crc kubenswrapper[4851]: I1001 13:07:30.967901 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qcfdn" Oct 01 13:07:30 crc kubenswrapper[4851]: I1001 13:07:30.970990 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 01 13:07:30 crc kubenswrapper[4851]: I1001 13:07:30.971203 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-vp299" Oct 01 13:07:30 crc kubenswrapper[4851]: I1001 13:07:30.973083 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 01 13:07:30 crc kubenswrapper[4851]: I1001 13:07:30.991405 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qcfdn"] Oct 01 13:07:31 crc kubenswrapper[4851]: I1001 13:07:31.006656 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:31 crc kubenswrapper[4851]: I1001 13:07:31.055530 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:31 crc kubenswrapper[4851]: I1001 13:07:31.088842 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7tl2\" (UniqueName: \"kubernetes.io/projected/5cca3f95-ff13-4027-ab94-f6581f1a2a23-kube-api-access-x7tl2\") pod \"openstack-operator-index-qcfdn\" (UID: \"5cca3f95-ff13-4027-ab94-f6581f1a2a23\") " pod="openstack-operators/openstack-operator-index-qcfdn" Oct 01 13:07:31 crc kubenswrapper[4851]: I1001 13:07:31.189971 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7tl2\" (UniqueName: \"kubernetes.io/projected/5cca3f95-ff13-4027-ab94-f6581f1a2a23-kube-api-access-x7tl2\") pod \"openstack-operator-index-qcfdn\" (UID: \"5cca3f95-ff13-4027-ab94-f6581f1a2a23\") " pod="openstack-operators/openstack-operator-index-qcfdn" Oct 01 13:07:31 crc kubenswrapper[4851]: I1001 13:07:31.210254 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7tl2\" (UniqueName: \"kubernetes.io/projected/5cca3f95-ff13-4027-ab94-f6581f1a2a23-kube-api-access-x7tl2\") pod \"openstack-operator-index-qcfdn\" (UID: \"5cca3f95-ff13-4027-ab94-f6581f1a2a23\") " pod="openstack-operators/openstack-operator-index-qcfdn" Oct 01 13:07:31 crc kubenswrapper[4851]: I1001 13:07:31.291478 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qcfdn" Oct 01 13:07:31 crc kubenswrapper[4851]: I1001 13:07:31.792580 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qcfdn"] Oct 01 13:07:32 crc kubenswrapper[4851]: I1001 13:07:32.160855 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qcfdn" event={"ID":"5cca3f95-ff13-4027-ab94-f6581f1a2a23","Type":"ContainerStarted","Data":"b88ae79a38541fc6bcb04f0266a6ae83beb7f5247b77c22ce8df91981060d47d"} Oct 01 13:07:34 crc kubenswrapper[4851]: I1001 13:07:34.336692 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-qcfdn"] Oct 01 13:07:34 crc kubenswrapper[4851]: I1001 13:07:34.934423 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-d96ms"] Oct 01 13:07:34 crc kubenswrapper[4851]: I1001 13:07:34.935120 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-d96ms" Oct 01 13:07:34 crc kubenswrapper[4851]: I1001 13:07:34.952213 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-d96ms"] Oct 01 13:07:35 crc kubenswrapper[4851]: I1001 13:07:35.048550 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfz95\" (UniqueName: \"kubernetes.io/projected/8a61efd7-e35e-49f6-880c-e2d18b49d157-kube-api-access-bfz95\") pod \"openstack-operator-index-d96ms\" (UID: \"8a61efd7-e35e-49f6-880c-e2d18b49d157\") " pod="openstack-operators/openstack-operator-index-d96ms" Oct 01 13:07:35 crc kubenswrapper[4851]: I1001 13:07:35.149523 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfz95\" (UniqueName: \"kubernetes.io/projected/8a61efd7-e35e-49f6-880c-e2d18b49d157-kube-api-access-bfz95\") pod \"openstack-operator-index-d96ms\" (UID: \"8a61efd7-e35e-49f6-880c-e2d18b49d157\") " pod="openstack-operators/openstack-operator-index-d96ms" Oct 01 13:07:35 crc kubenswrapper[4851]: I1001 13:07:35.186187 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfz95\" (UniqueName: \"kubernetes.io/projected/8a61efd7-e35e-49f6-880c-e2d18b49d157-kube-api-access-bfz95\") pod \"openstack-operator-index-d96ms\" (UID: \"8a61efd7-e35e-49f6-880c-e2d18b49d157\") " pod="openstack-operators/openstack-operator-index-d96ms" Oct 01 13:07:35 crc kubenswrapper[4851]: I1001 13:07:35.258755 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-d96ms" Oct 01 13:07:35 crc kubenswrapper[4851]: I1001 13:07:35.736998 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-d96ms"] Oct 01 13:07:35 crc kubenswrapper[4851]: W1001 13:07:35.745218 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a61efd7_e35e_49f6_880c_e2d18b49d157.slice/crio-209cbc9fd20ac109fb39b65246445cc444b6366d107af3fb7f19f718e946514c WatchSource:0}: Error finding container 209cbc9fd20ac109fb39b65246445cc444b6366d107af3fb7f19f718e946514c: Status 404 returned error can't find the container with id 209cbc9fd20ac109fb39b65246445cc444b6366d107af3fb7f19f718e946514c Oct 01 13:07:36 crc kubenswrapper[4851]: I1001 13:07:36.009388 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-8jcrr" Oct 01 13:07:36 crc kubenswrapper[4851]: I1001 13:07:36.194102 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-d96ms" event={"ID":"8a61efd7-e35e-49f6-880c-e2d18b49d157","Type":"ContainerStarted","Data":"b7ce7b1cc2db94c07fb72db0552f462cfee2b903f73c12ae83a2f7f35242bc02"} Oct 01 13:07:36 crc kubenswrapper[4851]: I1001 13:07:36.194396 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-d96ms" event={"ID":"8a61efd7-e35e-49f6-880c-e2d18b49d157","Type":"ContainerStarted","Data":"209cbc9fd20ac109fb39b65246445cc444b6366d107af3fb7f19f718e946514c"} Oct 01 13:07:36 crc kubenswrapper[4851]: I1001 13:07:36.196238 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qcfdn" event={"ID":"5cca3f95-ff13-4027-ab94-f6581f1a2a23","Type":"ContainerStarted","Data":"94098470da3140c7db4a7e913ee740b5e6d2d193be57bb14fbe006176c876d2d"} Oct 01 13:07:36 crc kubenswrapper[4851]: I1001 13:07:36.196421 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-qcfdn" podUID="5cca3f95-ff13-4027-ab94-f6581f1a2a23" containerName="registry-server" containerID="cri-o://94098470da3140c7db4a7e913ee740b5e6d2d193be57bb14fbe006176c876d2d" gracePeriod=2 Oct 01 13:07:36 crc kubenswrapper[4851]: I1001 13:07:36.214774 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-d96ms" podStartSLOduration=2.167205321 podStartE2EDuration="2.214756282s" podCreationTimestamp="2025-10-01 13:07:34 +0000 UTC" firstStartedPulling="2025-10-01 13:07:35.751394452 +0000 UTC m=+864.096511938" lastFinishedPulling="2025-10-01 13:07:35.798945403 +0000 UTC m=+864.144062899" observedRunningTime="2025-10-01 13:07:36.212208099 +0000 UTC m=+864.557325595" watchObservedRunningTime="2025-10-01 13:07:36.214756282 +0000 UTC m=+864.559873778" Oct 01 13:07:36 crc kubenswrapper[4851]: I1001 13:07:36.238742 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-qcfdn" podStartSLOduration=2.7679306329999998 podStartE2EDuration="6.238716352s" podCreationTimestamp="2025-10-01 13:07:30 +0000 UTC" firstStartedPulling="2025-10-01 13:07:31.806265942 +0000 UTC m=+860.151383468" lastFinishedPulling="2025-10-01 13:07:35.277051701 +0000 UTC m=+863.622169187" observedRunningTime="2025-10-01 13:07:36.235906482 +0000 UTC m=+864.581023978" watchObservedRunningTime="2025-10-01 
13:07:36.238716352 +0000 UTC m=+864.583833868" Oct 01 13:07:36 crc kubenswrapper[4851]: I1001 13:07:36.632919 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-wqlkq" Oct 01 13:07:36 crc kubenswrapper[4851]: I1001 13:07:36.700933 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qcfdn" Oct 01 13:07:36 crc kubenswrapper[4851]: I1001 13:07:36.709292 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-wrnrn" Oct 01 13:07:36 crc kubenswrapper[4851]: I1001 13:07:36.775363 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7tl2\" (UniqueName: \"kubernetes.io/projected/5cca3f95-ff13-4027-ab94-f6581f1a2a23-kube-api-access-x7tl2\") pod \"5cca3f95-ff13-4027-ab94-f6581f1a2a23\" (UID: \"5cca3f95-ff13-4027-ab94-f6581f1a2a23\") " Oct 01 13:07:36 crc kubenswrapper[4851]: I1001 13:07:36.782659 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cca3f95-ff13-4027-ab94-f6581f1a2a23-kube-api-access-x7tl2" (OuterVolumeSpecName: "kube-api-access-x7tl2") pod "5cca3f95-ff13-4027-ab94-f6581f1a2a23" (UID: "5cca3f95-ff13-4027-ab94-f6581f1a2a23"). InnerVolumeSpecName "kube-api-access-x7tl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:07:36 crc kubenswrapper[4851]: I1001 13:07:36.877372 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7tl2\" (UniqueName: \"kubernetes.io/projected/5cca3f95-ff13-4027-ab94-f6581f1a2a23-kube-api-access-x7tl2\") on node \"crc\" DevicePath \"\"" Oct 01 13:07:37 crc kubenswrapper[4851]: I1001 13:07:37.206518 4851 generic.go:334] "Generic (PLEG): container finished" podID="5cca3f95-ff13-4027-ab94-f6581f1a2a23" containerID="94098470da3140c7db4a7e913ee740b5e6d2d193be57bb14fbe006176c876d2d" exitCode=0 Oct 01 13:07:37 crc kubenswrapper[4851]: I1001 13:07:37.206604 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qcfdn" Oct 01 13:07:37 crc kubenswrapper[4851]: I1001 13:07:37.206649 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qcfdn" event={"ID":"5cca3f95-ff13-4027-ab94-f6581f1a2a23","Type":"ContainerDied","Data":"94098470da3140c7db4a7e913ee740b5e6d2d193be57bb14fbe006176c876d2d"} Oct 01 13:07:37 crc kubenswrapper[4851]: I1001 13:07:37.206735 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qcfdn" event={"ID":"5cca3f95-ff13-4027-ab94-f6581f1a2a23","Type":"ContainerDied","Data":"b88ae79a38541fc6bcb04f0266a6ae83beb7f5247b77c22ce8df91981060d47d"} Oct 01 13:07:37 crc kubenswrapper[4851]: I1001 13:07:37.206776 4851 scope.go:117] "RemoveContainer" containerID="94098470da3140c7db4a7e913ee740b5e6d2d193be57bb14fbe006176c876d2d" Oct 01 13:07:37 crc kubenswrapper[4851]: I1001 13:07:37.232918 4851 scope.go:117] "RemoveContainer" containerID="94098470da3140c7db4a7e913ee740b5e6d2d193be57bb14fbe006176c876d2d" Oct 01 13:07:37 crc kubenswrapper[4851]: E1001 13:07:37.233323 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94098470da3140c7db4a7e913ee740b5e6d2d193be57bb14fbe006176c876d2d\": container with ID starting with 94098470da3140c7db4a7e913ee740b5e6d2d193be57bb14fbe006176c876d2d not found: ID does not exist" containerID="94098470da3140c7db4a7e913ee740b5e6d2d193be57bb14fbe006176c876d2d" Oct 01 13:07:37 crc kubenswrapper[4851]: I1001 13:07:37.233359 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94098470da3140c7db4a7e913ee740b5e6d2d193be57bb14fbe006176c876d2d"} err="failed to get container status \"94098470da3140c7db4a7e913ee740b5e6d2d193be57bb14fbe006176c876d2d\": rpc error: code = NotFound desc = could not find container \"94098470da3140c7db4a7e913ee740b5e6d2d193be57bb14fbe006176c876d2d\": container with ID starting with 94098470da3140c7db4a7e913ee740b5e6d2d193be57bb14fbe006176c876d2d not found: ID does not exist" Oct 01 13:07:37 crc kubenswrapper[4851]: I1001 13:07:37.251643 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-qcfdn"] Oct 01 13:07:37 crc kubenswrapper[4851]: I1001 13:07:37.260611 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-qcfdn"] Oct 01 13:07:38 crc kubenswrapper[4851]: I1001 13:07:38.343740 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cca3f95-ff13-4027-ab94-f6581f1a2a23" path="/var/lib/kubelet/pods/5cca3f95-ff13-4027-ab94-f6581f1a2a23/volumes" Oct 01 13:07:45 crc kubenswrapper[4851]: I1001 13:07:45.259157 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-d96ms" Oct 01 13:07:45 crc kubenswrapper[4851]: I1001 13:07:45.260162 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-d96ms" Oct 01 13:07:45 crc kubenswrapper[4851]: I1001 13:07:45.305275 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-d96ms" Oct 01 13:07:45 crc kubenswrapper[4851]: I1001 13:07:45.349424 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-d96ms" Oct 01 13:07:46 crc 
kubenswrapper[4851]: I1001 13:07:46.203918 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc"] Oct 01 13:07:46 crc kubenswrapper[4851]: E1001 13:07:46.204203 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cca3f95-ff13-4027-ab94-f6581f1a2a23" containerName="registry-server" Oct 01 13:07:46 crc kubenswrapper[4851]: I1001 13:07:46.204217 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cca3f95-ff13-4027-ab94-f6581f1a2a23" containerName="registry-server" Oct 01 13:07:46 crc kubenswrapper[4851]: I1001 13:07:46.204362 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cca3f95-ff13-4027-ab94-f6581f1a2a23" containerName="registry-server" Oct 01 13:07:46 crc kubenswrapper[4851]: I1001 13:07:46.205731 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc" Oct 01 13:07:46 crc kubenswrapper[4851]: I1001 13:07:46.211052 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-9v259" Oct 01 13:07:46 crc kubenswrapper[4851]: I1001 13:07:46.218265 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc"] Oct 01 13:07:46 crc kubenswrapper[4851]: I1001 13:07:46.344050 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2793519-7741-4285-b90b-2210f2c7a421-util\") pod \"4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc\" (UID: \"b2793519-7741-4285-b90b-2210f2c7a421\") " pod="openstack-operators/4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc" Oct 01 13:07:46 crc kubenswrapper[4851]: I1001 13:07:46.344116 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq9bs\" (UniqueName: \"kubernetes.io/projected/b2793519-7741-4285-b90b-2210f2c7a421-kube-api-access-zq9bs\") pod \"4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc\" (UID: \"b2793519-7741-4285-b90b-2210f2c7a421\") " pod="openstack-operators/4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc" Oct 01 13:07:46 crc kubenswrapper[4851]: I1001 13:07:46.344294 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2793519-7741-4285-b90b-2210f2c7a421-bundle\") pod \"4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc\" (UID: \"b2793519-7741-4285-b90b-2210f2c7a421\") " pod="openstack-operators/4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc" Oct 01 13:07:46 crc kubenswrapper[4851]: I1001 13:07:46.446059 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2793519-7741-4285-b90b-2210f2c7a421-bundle\") pod \"4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc\" (UID: \"b2793519-7741-4285-b90b-2210f2c7a421\") " pod="openstack-operators/4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc" Oct 01 13:07:46 crc kubenswrapper[4851]: I1001 13:07:46.446191 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2793519-7741-4285-b90b-2210f2c7a421-util\") pod 
\"4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc\" (UID: \"b2793519-7741-4285-b90b-2210f2c7a421\") " pod="openstack-operators/4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc" Oct 01 13:07:46 crc kubenswrapper[4851]: I1001 13:07:46.446351 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq9bs\" (UniqueName: \"kubernetes.io/projected/b2793519-7741-4285-b90b-2210f2c7a421-kube-api-access-zq9bs\") pod \"4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc\" (UID: \"b2793519-7741-4285-b90b-2210f2c7a421\") " pod="openstack-operators/4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc" Oct 01 13:07:46 crc kubenswrapper[4851]: I1001 13:07:46.446812 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2793519-7741-4285-b90b-2210f2c7a421-bundle\") pod \"4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc\" (UID: \"b2793519-7741-4285-b90b-2210f2c7a421\") " pod="openstack-operators/4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc" Oct 01 13:07:46 crc kubenswrapper[4851]: I1001 13:07:46.446875 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2793519-7741-4285-b90b-2210f2c7a421-util\") pod \"4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc\" (UID: \"b2793519-7741-4285-b90b-2210f2c7a421\") " pod="openstack-operators/4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc" Oct 01 13:07:46 crc kubenswrapper[4851]: I1001 13:07:46.471490 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq9bs\" (UniqueName: \"kubernetes.io/projected/b2793519-7741-4285-b90b-2210f2c7a421-kube-api-access-zq9bs\") pod \"4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc\" (UID: \"b2793519-7741-4285-b90b-2210f2c7a421\") " pod="openstack-operators/4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc" Oct 01 13:07:46 crc kubenswrapper[4851]: I1001 13:07:46.538596 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc" Oct 01 13:07:47 crc kubenswrapper[4851]: I1001 13:07:47.013604 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc"] Oct 01 13:07:47 crc kubenswrapper[4851]: W1001 13:07:47.020413 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2793519_7741_4285_b90b_2210f2c7a421.slice/crio-864ff96629d8e92036c5b9459893105f0bdffc8a0e44ebc35d88fd9c38a1b6b8 WatchSource:0}: Error finding container 864ff96629d8e92036c5b9459893105f0bdffc8a0e44ebc35d88fd9c38a1b6b8: Status 404 returned error can't find the container with id 864ff96629d8e92036c5b9459893105f0bdffc8a0e44ebc35d88fd9c38a1b6b8 Oct 01 13:07:47 crc kubenswrapper[4851]: I1001 13:07:47.295064 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc" event={"ID":"b2793519-7741-4285-b90b-2210f2c7a421","Type":"ContainerStarted","Data":"4baaabd8253a8147b9a5c5949988403d237469b0c3bdf0e5db097637caf874cd"} Oct 01 13:07:47 crc kubenswrapper[4851]: I1001 13:07:47.295124 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc" event={"ID":"b2793519-7741-4285-b90b-2210f2c7a421","Type":"ContainerStarted","Data":"864ff96629d8e92036c5b9459893105f0bdffc8a0e44ebc35d88fd9c38a1b6b8"} Oct 01 13:07:48 crc kubenswrapper[4851]: I1001 13:07:48.305721 4851 generic.go:334] "Generic (PLEG): container finished" podID="b2793519-7741-4285-b90b-2210f2c7a421" containerID="4baaabd8253a8147b9a5c5949988403d237469b0c3bdf0e5db097637caf874cd" exitCode=0 Oct 01 13:07:48 crc kubenswrapper[4851]: I1001 13:07:48.305784 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc" event={"ID":"b2793519-7741-4285-b90b-2210f2c7a421","Type":"ContainerDied","Data":"4baaabd8253a8147b9a5c5949988403d237469b0c3bdf0e5db097637caf874cd"} Oct 01 13:07:49 crc kubenswrapper[4851]: I1001 13:07:49.320150 4851 generic.go:334] "Generic (PLEG): container finished" podID="b2793519-7741-4285-b90b-2210f2c7a421" containerID="9cd89ee98bb1aa5c1b05abbea503ea2e27dc4c7f4c27702d62cfb0ef874ff1de" exitCode=0 Oct 01 13:07:49 crc kubenswrapper[4851]: I1001 13:07:49.320251 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc" event={"ID":"b2793519-7741-4285-b90b-2210f2c7a421","Type":"ContainerDied","Data":"9cd89ee98bb1aa5c1b05abbea503ea2e27dc4c7f4c27702d62cfb0ef874ff1de"} Oct 01 13:07:50 crc kubenswrapper[4851]: I1001 13:07:50.366773 4851 generic.go:334] "Generic (PLEG): container finished" podID="b2793519-7741-4285-b90b-2210f2c7a421" containerID="fa2dbec75e77d3625aed12741b6e6e8898d5aa46337f2b2641955578d9e126c6" exitCode=0 Oct 01 13:07:50 crc kubenswrapper[4851]: I1001 13:07:50.366879 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc" event={"ID":"b2793519-7741-4285-b90b-2210f2c7a421","Type":"ContainerDied","Data":"fa2dbec75e77d3625aed12741b6e6e8898d5aa46337f2b2641955578d9e126c6"} Oct 01 13:07:51 crc kubenswrapper[4851]: I1001 13:07:51.739948 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc" Oct 01 13:07:51 crc kubenswrapper[4851]: I1001 13:07:51.848096 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2793519-7741-4285-b90b-2210f2c7a421-bundle\") pod \"b2793519-7741-4285-b90b-2210f2c7a421\" (UID: \"b2793519-7741-4285-b90b-2210f2c7a421\") " Oct 01 13:07:51 crc kubenswrapper[4851]: I1001 13:07:51.848183 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2793519-7741-4285-b90b-2210f2c7a421-util\") pod \"b2793519-7741-4285-b90b-2210f2c7a421\" (UID: \"b2793519-7741-4285-b90b-2210f2c7a421\") " Oct 01 13:07:51 crc kubenswrapper[4851]: I1001 13:07:51.848216 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq9bs\" (UniqueName: \"kubernetes.io/projected/b2793519-7741-4285-b90b-2210f2c7a421-kube-api-access-zq9bs\") pod \"b2793519-7741-4285-b90b-2210f2c7a421\" (UID: \"b2793519-7741-4285-b90b-2210f2c7a421\") " Oct 01 13:07:51 crc kubenswrapper[4851]: I1001 13:07:51.850790 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2793519-7741-4285-b90b-2210f2c7a421-bundle" (OuterVolumeSpecName: "bundle") pod "b2793519-7741-4285-b90b-2210f2c7a421" (UID: "b2793519-7741-4285-b90b-2210f2c7a421"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:07:51 crc kubenswrapper[4851]: I1001 13:07:51.856305 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2793519-7741-4285-b90b-2210f2c7a421-kube-api-access-zq9bs" (OuterVolumeSpecName: "kube-api-access-zq9bs") pod "b2793519-7741-4285-b90b-2210f2c7a421" (UID: "b2793519-7741-4285-b90b-2210f2c7a421"). InnerVolumeSpecName "kube-api-access-zq9bs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:07:51 crc kubenswrapper[4851]: I1001 13:07:51.862909 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2793519-7741-4285-b90b-2210f2c7a421-util" (OuterVolumeSpecName: "util") pod "b2793519-7741-4285-b90b-2210f2c7a421" (UID: "b2793519-7741-4285-b90b-2210f2c7a421"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:07:51 crc kubenswrapper[4851]: I1001 13:07:51.950451 4851 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2793519-7741-4285-b90b-2210f2c7a421-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:07:51 crc kubenswrapper[4851]: I1001 13:07:51.950488 4851 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2793519-7741-4285-b90b-2210f2c7a421-util\") on node \"crc\" DevicePath \"\"" Oct 01 13:07:51 crc kubenswrapper[4851]: I1001 13:07:51.950518 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq9bs\" (UniqueName: \"kubernetes.io/projected/b2793519-7741-4285-b90b-2210f2c7a421-kube-api-access-zq9bs\") on node \"crc\" DevicePath \"\"" Oct 01 13:07:52 crc kubenswrapper[4851]: I1001 13:07:52.388315 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc" event={"ID":"b2793519-7741-4285-b90b-2210f2c7a421","Type":"ContainerDied","Data":"864ff96629d8e92036c5b9459893105f0bdffc8a0e44ebc35d88fd9c38a1b6b8"} Oct 01 13:07:52 crc kubenswrapper[4851]: I1001 13:07:52.388988 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="864ff96629d8e92036c5b9459893105f0bdffc8a0e44ebc35d88fd9c38a1b6b8" Oct 01 13:07:52 crc kubenswrapper[4851]: I1001 13:07:52.388373 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc" Oct 01 13:08:00 crc kubenswrapper[4851]: I1001 13:08:00.050341 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:08:00 crc kubenswrapper[4851]: I1001 13:08:00.051019 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:08:09 crc kubenswrapper[4851]: I1001 13:08:09.109106 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-548cc7d4f7-ll79v"] Oct 01 13:08:09 crc kubenswrapper[4851]: E1001 13:08:09.110079 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2793519-7741-4285-b90b-2210f2c7a421" containerName="util" Oct 01 13:08:09 crc kubenswrapper[4851]: I1001 13:08:09.110100 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2793519-7741-4285-b90b-2210f2c7a421" containerName="util" Oct 01 13:08:09 crc kubenswrapper[4851]: E1001 13:08:09.110130 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2793519-7741-4285-b90b-2210f2c7a421" containerName="pull" Oct 01 13:08:09 crc kubenswrapper[4851]: I1001 13:08:09.110142 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2793519-7741-4285-b90b-2210f2c7a421" containerName="pull" Oct 01 13:08:09 crc kubenswrapper[4851]: E1001 13:08:09.110163 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2793519-7741-4285-b90b-2210f2c7a421" containerName="extract" Oct 01 
Oct 01 13:08:09 crc kubenswrapper[4851]: I1001 13:08:09.110176 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2793519-7741-4285-b90b-2210f2c7a421" containerName="extract" Oct 01 13:08:09 crc kubenswrapper[4851]: I1001 13:08:09.110374 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2793519-7741-4285-b90b-2210f2c7a421" containerName="extract" Oct 01 13:08:09 crc kubenswrapper[4851]: I1001 13:08:09.111385 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-548cc7d4f7-ll79v" Oct 01 13:08:09 crc kubenswrapper[4851]: I1001 13:08:09.114020 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-xksqm" Oct 01 13:08:09 crc kubenswrapper[4851]: I1001 13:08:09.135800 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-548cc7d4f7-ll79v"] Oct 01 13:08:09 crc kubenswrapper[4851]: I1001 13:08:09.208129 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzczt\" (UniqueName: \"kubernetes.io/projected/fdc19a2b-5602-40b0-a49d-22d56b9724d7-kube-api-access-xzczt\") pod \"openstack-operator-controller-operator-548cc7d4f7-ll79v\" (UID: \"fdc19a2b-5602-40b0-a49d-22d56b9724d7\") " pod="openstack-operators/openstack-operator-controller-operator-548cc7d4f7-ll79v" Oct 01 13:08:09 crc kubenswrapper[4851]: I1001 13:08:09.309022 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzczt\" (UniqueName: \"kubernetes.io/projected/fdc19a2b-5602-40b0-a49d-22d56b9724d7-kube-api-access-xzczt\") pod \"openstack-operator-controller-operator-548cc7d4f7-ll79v\" (UID: \"fdc19a2b-5602-40b0-a49d-22d56b9724d7\") " pod="openstack-operators/openstack-operator-controller-operator-548cc7d4f7-ll79v" Oct 01 13:08:09 crc kubenswrapper[4851]: I1001 13:08:09.332526 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzczt\" (UniqueName: \"kubernetes.io/projected/fdc19a2b-5602-40b0-a49d-22d56b9724d7-kube-api-access-xzczt\") pod \"openstack-operator-controller-operator-548cc7d4f7-ll79v\" (UID: \"fdc19a2b-5602-40b0-a49d-22d56b9724d7\") " pod="openstack-operators/openstack-operator-controller-operator-548cc7d4f7-ll79v" Oct 01 13:08:09 crc kubenswrapper[4851]: I1001 13:08:09.437018 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-548cc7d4f7-ll79v" Oct 01 13:08:09 crc kubenswrapper[4851]: I1001 13:08:09.743491 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-548cc7d4f7-ll79v"] Oct 01 13:08:09 crc kubenswrapper[4851]: I1001 13:08:09.754530 4851 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:08:10 crc kubenswrapper[4851]: I1001 13:08:10.552915 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-548cc7d4f7-ll79v" event={"ID":"fdc19a2b-5602-40b0-a49d-22d56b9724d7","Type":"ContainerStarted","Data":"7d8536b26c7dc0e37a0c1e0c4af077f79df2481d0b32e3d81b3610d51fda2a57"} Oct 01 13:08:14 crc kubenswrapper[4851]: I1001 13:08:14.583392 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-548cc7d4f7-ll79v" event={"ID":"fdc19a2b-5602-40b0-a49d-22d56b9724d7","Type":"ContainerStarted","Data":"be72ded388f40f998f9ef380687af993a5e767ba7b832863ad820a509b8df86a"} Oct 01 13:08:16 crc kubenswrapper[4851]: I1001 13:08:16.598391 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-548cc7d4f7-ll79v" event={"ID":"fdc19a2b-5602-40b0-a49d-22d56b9724d7","Type":"ContainerStarted","Data":"48671384c60ef8c50fa8786b1b5078151fadf6c819475b64a4f866a416b5f51d"} Oct 01 13:08:16 crc kubenswrapper[4851]: I1001 13:08:16.598920 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-548cc7d4f7-ll79v" Oct 01 13:08:19 crc kubenswrapper[4851]: I1001 13:08:19.441589 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-548cc7d4f7-ll79v" Oct 01 13:08:19 crc kubenswrapper[4851]: I1001 13:08:19.493087 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-548cc7d4f7-ll79v" podStartSLOduration=4.230014998 podStartE2EDuration="10.493073808s" podCreationTimestamp="2025-10-01 13:08:09 +0000 UTC" firstStartedPulling="2025-10-01 13:08:09.754266468 +0000 UTC m=+898.099383954" lastFinishedPulling="2025-10-01 13:08:16.017325268 +0000 UTC m=+904.362442764" observedRunningTime="2025-10-01 13:08:16.64386669 +0000 UTC m=+904.988984206" watchObservedRunningTime="2025-10-01 13:08:19.493073808 +0000 UTC m=+907.838191294" Oct 01 13:08:30 crc kubenswrapper[4851]: I1001 13:08:30.050214 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:08:30 crc kubenswrapper[4851]: I1001 13:08:30.050940 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.713293 4851 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-nxjn7"] Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.715763 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-nxjn7" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.719372 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-f269c" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.731194 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-ms6rc"] Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.732457 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-ms6rc" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.735625 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-dkwpj" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.737143 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-nxjn7"] Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.753671 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-ms6rc"] Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.770677 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-knmsq"] Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.771723 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-knmsq" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.774649 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-5vljk" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.781079 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-8c9zh"] Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.782317 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-8c9zh" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.784595 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-vtqx7" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.794564 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-knmsq"] Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.819015 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-8c9zh"] Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.823247 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2vlj\" (UniqueName: \"kubernetes.io/projected/613abfd6-27d5-4f52-bad5-024d71335465-kube-api-access-z2vlj\") pod \"barbican-operator-controller-manager-6ff8b75857-nxjn7\" (UID: \"613abfd6-27d5-4f52-bad5-024d71335465\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-nxjn7" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.823296 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2s65\" (UniqueName: \"kubernetes.io/projected/08cf0ffd-5ec4-4406-a912-319a1c9ced15-kube-api-access-v2s65\") pod \"cinder-operator-controller-manager-644bddb6d8-ms6rc\" (UID: \"08cf0ffd-5ec4-4406-a912-319a1c9ced15\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-ms6rc" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.828863 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-r7gqw"] Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.829890 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-r7gqw" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.832078 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-62g2m" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.837572 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-tssnp"] Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.842763 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-tssnp" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.845364 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-sr7p5" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.872232 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-tssnp"] Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.922932 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-r7gqw"] Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.924203 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbxcq\" (UniqueName: \"kubernetes.io/projected/4dcb48dd-5baf-415e-861e-ebfa40fc2e84-kube-api-access-rbxcq\") pod \"designate-operator-controller-manager-84f4f7b77b-knmsq\" (UID: \"4dcb48dd-5baf-415e-861e-ebfa40fc2e84\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-knmsq" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.924268 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42v78\" (UniqueName: \"kubernetes.io/projected/ba805aa7-4c6e-4dc1-8de8-c935ab1c2128-kube-api-access-42v78\") pod \"heat-operator-controller-manager-5d889d78cf-r7gqw\" (UID: \"ba805aa7-4c6e-4dc1-8de8-c935ab1c2128\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-r7gqw" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.924295 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh8j7\" (UniqueName: \"kubernetes.io/projected/a1e24a95-87b4-4f06-b651-9c8c26a7021d-kube-api-access-bh8j7\") pod \"glance-operator-controller-manager-84958c4d49-8c9zh\" (UID: \"a1e24a95-87b4-4f06-b651-9c8c26a7021d\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-8c9zh" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.924318 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2vlj\" (UniqueName: \"kubernetes.io/projected/613abfd6-27d5-4f52-bad5-024d71335465-kube-api-access-z2vlj\") pod \"barbican-operator-controller-manager-6ff8b75857-nxjn7\" (UID: \"613abfd6-27d5-4f52-bad5-024d71335465\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-nxjn7" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.924342 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9dbr\" (UniqueName: \"kubernetes.io/projected/a01e01ff-d00e-482d-8901-72c0705672f1-kube-api-access-p9dbr\") pod \"horizon-operator-controller-manager-9f4696d94-tssnp\" (UID: \"a01e01ff-d00e-482d-8901-72c0705672f1\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-tssnp" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.924362 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2s65\" (UniqueName: \"kubernetes.io/projected/08cf0ffd-5ec4-4406-a912-319a1c9ced15-kube-api-access-v2s65\") pod \"cinder-operator-controller-manager-644bddb6d8-ms6rc\" (UID: \"08cf0ffd-5ec4-4406-a912-319a1c9ced15\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-ms6rc" Oct 01 13:08:50 
Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.941576 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-4chzf"] Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.942619 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-4chzf" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.946159 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2vlj\" (UniqueName: \"kubernetes.io/projected/613abfd6-27d5-4f52-bad5-024d71335465-kube-api-access-z2vlj\") pod \"barbican-operator-controller-manager-6ff8b75857-nxjn7\" (UID: \"613abfd6-27d5-4f52-bad5-024d71335465\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-nxjn7" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.946463 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.946648 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-2mkgf" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.968364 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2s65\" (UniqueName: \"kubernetes.io/projected/08cf0ffd-5ec4-4406-a912-319a1c9ced15-kube-api-access-v2s65\") pod \"cinder-operator-controller-manager-644bddb6d8-ms6rc\" (UID: \"08cf0ffd-5ec4-4406-a912-319a1c9ced15\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-ms6rc" Oct 01 13:08:50 crc kubenswrapper[4851]: I1001 13:08:50.991181 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-4chzf"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.004568 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-dq52c"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.005624 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-dq52c" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.008523 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-flp6z"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.009623 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-flp6z" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.009712 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-w6mxv" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.014056 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-jmbjz" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.025475 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42v78\" (UniqueName: \"kubernetes.io/projected/ba805aa7-4c6e-4dc1-8de8-c935ab1c2128-kube-api-access-42v78\") pod \"heat-operator-controller-manager-5d889d78cf-r7gqw\" (UID: \"ba805aa7-4c6e-4dc1-8de8-c935ab1c2128\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-r7gqw" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.025524 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63075165-70d9-4dd4-9b52-41e59e59fcee-cert\") pod \"infra-operator-controller-manager-9d6c5db85-4chzf\" (UID: \"63075165-70d9-4dd4-9b52-41e59e59fcee\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-4chzf" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.025554 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh8j7\" (UniqueName: \"kubernetes.io/projected/a1e24a95-87b4-4f06-b651-9c8c26a7021d-kube-api-access-bh8j7\") pod \"glance-operator-controller-manager-84958c4d49-8c9zh\" (UID: \"a1e24a95-87b4-4f06-b651-9c8c26a7021d\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-8c9zh" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.025577 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9dbr\" (UniqueName: \"kubernetes.io/projected/a01e01ff-d00e-482d-8901-72c0705672f1-kube-api-access-p9dbr\") pod \"horizon-operator-controller-manager-9f4696d94-tssnp\" (UID: \"a01e01ff-d00e-482d-8901-72c0705672f1\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-tssnp" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.025609 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dzf7\" (UniqueName: \"kubernetes.io/projected/63075165-70d9-4dd4-9b52-41e59e59fcee-kube-api-access-7dzf7\") pod \"infra-operator-controller-manager-9d6c5db85-4chzf\" (UID: \"63075165-70d9-4dd4-9b52-41e59e59fcee\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-4chzf" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.025659 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbxcq\" (UniqueName: \"kubernetes.io/projected/4dcb48dd-5baf-415e-861e-ebfa40fc2e84-kube-api-access-rbxcq\") pod \"designate-operator-controller-manager-84f4f7b77b-knmsq\" (UID: \"4dcb48dd-5baf-415e-861e-ebfa40fc2e84\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-knmsq" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.044179 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-nxjn7" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.050471 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh8j7\" (UniqueName: \"kubernetes.io/projected/a1e24a95-87b4-4f06-b651-9c8c26a7021d-kube-api-access-bh8j7\") pod \"glance-operator-controller-manager-84958c4d49-8c9zh\" (UID: \"a1e24a95-87b4-4f06-b651-9c8c26a7021d\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-8c9zh" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.052598 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-zx88h"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.053628 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-zx88h" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.054626 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-ms6rc" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.054949 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbxcq\" (UniqueName: \"kubernetes.io/projected/4dcb48dd-5baf-415e-861e-ebfa40fc2e84-kube-api-access-rbxcq\") pod \"designate-operator-controller-manager-84f4f7b77b-knmsq\" (UID: \"4dcb48dd-5baf-415e-861e-ebfa40fc2e84\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-knmsq" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.058061 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-dq52c"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.058625 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-tg7rg" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.059184 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9dbr\" (UniqueName: \"kubernetes.io/projected/a01e01ff-d00e-482d-8901-72c0705672f1-kube-api-access-p9dbr\") pod \"horizon-operator-controller-manager-9f4696d94-tssnp\" (UID: \"a01e01ff-d00e-482d-8901-72c0705672f1\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-tssnp" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.066499 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42v78\" (UniqueName: \"kubernetes.io/projected/ba805aa7-4c6e-4dc1-8de8-c935ab1c2128-kube-api-access-42v78\") pod \"heat-operator-controller-manager-5d889d78cf-r7gqw\" (UID: \"ba805aa7-4c6e-4dc1-8de8-c935ab1c2128\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-r7gqw" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.073778 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-flp6z"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.075487 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-zx88h"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.083691 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-qjqpv"] Oct 01 13:08:51 crc 
kubenswrapper[4851]: I1001 13:08:51.084934 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-qjqpv"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.085012 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-qjqpv" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.085671 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-knmsq" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.085984 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-22n8s"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.086784 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-f28lg" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.086975 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-22n8s" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.091002 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-j485f" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.092882 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-mkxn2"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.097320 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mkxn2" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.107614 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-22n8s"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.108040 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-8c9zh" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.108839 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-478nv" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.116852 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-mkxn2"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.124318 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-qxlj6"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.125422 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-qxlj6" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.127071 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-cfxd7" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.128569 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63075165-70d9-4dd4-9b52-41e59e59fcee-cert\") pod \"infra-operator-controller-manager-9d6c5db85-4chzf\" (UID: \"63075165-70d9-4dd4-9b52-41e59e59fcee\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-4chzf" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.128637 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbsmc\" (UniqueName: \"kubernetes.io/projected/ac9e1ccb-a68d-446e-b47b-de00d828f332-kube-api-access-qbsmc\") pod \"ironic-operator-controller-manager-5cd4858477-dq52c\" (UID: \"ac9e1ccb-a68d-446e-b47b-de00d828f332\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-dq52c" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.128687 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dzf7\" (UniqueName: \"kubernetes.io/projected/63075165-70d9-4dd4-9b52-41e59e59fcee-kube-api-access-7dzf7\") pod \"infra-operator-controller-manager-9d6c5db85-4chzf\" (UID: \"63075165-70d9-4dd4-9b52-41e59e59fcee\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-4chzf" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.128711 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48m2x\" (UniqueName: \"kubernetes.io/projected/a7cf8be8-c21f-4904-838e-f185857ef960-kube-api-access-48m2x\") pod \"keystone-operator-controller-manager-5bd55b4bff-flp6z\" (UID: \"a7cf8be8-c21f-4904-838e-f185857ef960\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-flp6z" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.128763 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89b62\" (UniqueName: \"kubernetes.io/projected/324c39a8-85b5-4caf-a719-a5f47a827d08-kube-api-access-89b62\") pod \"manila-operator-controller-manager-6d68dbc695-zx88h\" (UID: \"324c39a8-85b5-4caf-a719-a5f47a827d08\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-zx88h" Oct 01 13:08:51 crc kubenswrapper[4851]: E1001 13:08:51.128925 4851 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 01 13:08:51 crc kubenswrapper[4851]: E1001 13:08:51.128974 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63075165-70d9-4dd4-9b52-41e59e59fcee-cert podName:63075165-70d9-4dd4-9b52-41e59e59fcee nodeName:}" failed. No retries permitted until 2025-10-01 13:08:51.628956501 +0000 UTC m=+939.974073987 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/63075165-70d9-4dd4-9b52-41e59e59fcee-cert") pod "infra-operator-controller-manager-9d6c5db85-4chzf" (UID: "63075165-70d9-4dd4-9b52-41e59e59fcee") : secret "infra-operator-webhook-server-cert" not found Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.129303 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-qxlj6"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.133072 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-c8r2z"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.134047 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-c8r2z" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.137311 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-c8r2z"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.139391 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-gfsq7" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.143324 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8ctmstz"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.145050 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8ctmstz" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.146269 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-cslt7"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.147033 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-cslt7" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.150959 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-6l992" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.151181 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-kt8gd" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.151295 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.152681 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-cslt7"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.156921 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8ctmstz"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.163613 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dzf7\" (UniqueName: \"kubernetes.io/projected/63075165-70d9-4dd4-9b52-41e59e59fcee-kube-api-access-7dzf7\") pod \"infra-operator-controller-manager-9d6c5db85-4chzf\" (UID: \"63075165-70d9-4dd4-9b52-41e59e59fcee\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-4chzf" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.165657 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-62xmc"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.166781 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62xmc" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.169845 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-62xmc"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.170855 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-rlxcq" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.177638 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-r7gqw" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.200672 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-zzts5"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.201827 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-zzts5" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.203705 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-mbmxs" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.205130 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-tssnp" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.228080 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-zzts5"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.230118 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbsmc\" (UniqueName: \"kubernetes.io/projected/ac9e1ccb-a68d-446e-b47b-de00d828f332-kube-api-access-qbsmc\") pod \"ironic-operator-controller-manager-5cd4858477-dq52c\" (UID: \"ac9e1ccb-a68d-446e-b47b-de00d828f332\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-dq52c" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.230162 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vbvj\" (UniqueName: \"kubernetes.io/projected/9ac4cdbb-6f06-4525-8f5a-61f81a230708-kube-api-access-2vbvj\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8ctmstz\" (UID: \"9ac4cdbb-6f06-4525-8f5a-61f81a230708\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8ctmstz" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.230189 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l56cs\" (UniqueName: \"kubernetes.io/projected/7abbaca1-f067-43b8-a24e-0219ce7e7eaa-kube-api-access-l56cs\") pod \"nova-operator-controller-manager-64cd67b5cb-mkxn2\" (UID: \"7abbaca1-f067-43b8-a24e-0219ce7e7eaa\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mkxn2" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.230215 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48m2x\" (UniqueName: \"kubernetes.io/projected/a7cf8be8-c21f-4904-838e-f185857ef960-kube-api-access-48m2x\") pod \"keystone-operator-controller-manager-5bd55b4bff-flp6z\" (UID: \"a7cf8be8-c21f-4904-838e-f185857ef960\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-flp6z" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.230236 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks6ws\" (UniqueName: \"kubernetes.io/projected/9389dca9-fd43-4f83-a3a8-b755859b252e-kube-api-access-ks6ws\") pod \"swift-operator-controller-manager-84d6b4b759-62xmc\" (UID: \"9389dca9-fd43-4f83-a3a8-b755859b252e\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62xmc" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.230261 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89b62\" (UniqueName: \"kubernetes.io/projected/324c39a8-85b5-4caf-a719-a5f47a827d08-kube-api-access-89b62\") pod \"manila-operator-controller-manager-6d68dbc695-zx88h\" (UID: \"324c39a8-85b5-4caf-a719-a5f47a827d08\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-zx88h" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.230297 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8slw\" (UniqueName: \"kubernetes.io/projected/5d973d57-3aa0-4d14-9c4a-435f6ff880af-kube-api-access-j8slw\") pod \"neutron-operator-controller-manager-849d5b9b84-22n8s\" (UID: \"5d973d57-3aa0-4d14-9c4a-435f6ff880af\") " 
pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-22n8s" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.230316 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcjtk\" (UniqueName: \"kubernetes.io/projected/de177c71-0c78-416b-a62f-2d73d86a2b70-kube-api-access-fcjtk\") pod \"mariadb-operator-controller-manager-88c7-qjqpv\" (UID: \"de177c71-0c78-416b-a62f-2d73d86a2b70\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-qjqpv" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.230333 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfq84\" (UniqueName: \"kubernetes.io/projected/d7ddf969-982c-4436-85e4-fb963d57a385-kube-api-access-wfq84\") pod \"placement-operator-controller-manager-589c58c6c-cslt7\" (UID: \"d7ddf969-982c-4436-85e4-fb963d57a385\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-cslt7" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.230351 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9bvq\" (UniqueName: \"kubernetes.io/projected/ee6067ef-427a-49d8-99b6-694930e44a0d-kube-api-access-f9bvq\") pod \"octavia-operator-controller-manager-7b787867f4-qxlj6\" (UID: \"ee6067ef-427a-49d8-99b6-694930e44a0d\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-qxlj6" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.231475 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv6hb\" (UniqueName: \"kubernetes.io/projected/f3be6466-a46d-49b7-a5e4-9465c82ce165-kube-api-access-fv6hb\") pod \"ovn-operator-controller-manager-9976ff44c-c8r2z\" (UID: \"f3be6466-a46d-49b7-a5e4-9465c82ce165\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-c8r2z" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.231524 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ac4cdbb-6f06-4525-8f5a-61f81a230708-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8ctmstz\" (UID: \"9ac4cdbb-6f06-4525-8f5a-61f81a230708\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8ctmstz" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.263572 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48m2x\" (UniqueName: \"kubernetes.io/projected/a7cf8be8-c21f-4904-838e-f185857ef960-kube-api-access-48m2x\") pod \"keystone-operator-controller-manager-5bd55b4bff-flp6z\" (UID: \"a7cf8be8-c21f-4904-838e-f185857ef960\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-flp6z" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.267437 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89b62\" (UniqueName: \"kubernetes.io/projected/324c39a8-85b5-4caf-a719-a5f47a827d08-kube-api-access-89b62\") pod \"manila-operator-controller-manager-6d68dbc695-zx88h\" (UID: \"324c39a8-85b5-4caf-a719-a5f47a827d08\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-zx88h" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.268706 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbsmc\" (UniqueName: 
\"kubernetes.io/projected/ac9e1ccb-a68d-446e-b47b-de00d828f332-kube-api-access-qbsmc\") pod \"ironic-operator-controller-manager-5cd4858477-dq52c\" (UID: \"ac9e1ccb-a68d-446e-b47b-de00d828f332\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-dq52c" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.332677 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-dq52c" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.335907 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8slw\" (UniqueName: \"kubernetes.io/projected/5d973d57-3aa0-4d14-9c4a-435f6ff880af-kube-api-access-j8slw\") pod \"neutron-operator-controller-manager-849d5b9b84-22n8s\" (UID: \"5d973d57-3aa0-4d14-9c4a-435f6ff880af\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-22n8s" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.335944 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcjtk\" (UniqueName: \"kubernetes.io/projected/de177c71-0c78-416b-a62f-2d73d86a2b70-kube-api-access-fcjtk\") pod \"mariadb-operator-controller-manager-88c7-qjqpv\" (UID: \"de177c71-0c78-416b-a62f-2d73d86a2b70\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-qjqpv" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.335968 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfq84\" (UniqueName: \"kubernetes.io/projected/d7ddf969-982c-4436-85e4-fb963d57a385-kube-api-access-wfq84\") pod \"placement-operator-controller-manager-589c58c6c-cslt7\" (UID: \"d7ddf969-982c-4436-85e4-fb963d57a385\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-cslt7" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.335987 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9bvq\" (UniqueName: \"kubernetes.io/projected/ee6067ef-427a-49d8-99b6-694930e44a0d-kube-api-access-f9bvq\") pod \"octavia-operator-controller-manager-7b787867f4-qxlj6\" (UID: \"ee6067ef-427a-49d8-99b6-694930e44a0d\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-qxlj6" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.336047 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv6hb\" (UniqueName: \"kubernetes.io/projected/f3be6466-a46d-49b7-a5e4-9465c82ce165-kube-api-access-fv6hb\") pod \"ovn-operator-controller-manager-9976ff44c-c8r2z\" (UID: \"f3be6466-a46d-49b7-a5e4-9465c82ce165\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-c8r2z" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.336080 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ac4cdbb-6f06-4525-8f5a-61f81a230708-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8ctmstz\" (UID: \"9ac4cdbb-6f06-4525-8f5a-61f81a230708\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8ctmstz" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.336134 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vbvj\" (UniqueName: \"kubernetes.io/projected/9ac4cdbb-6f06-4525-8f5a-61f81a230708-kube-api-access-2vbvj\") pod 
\"openstack-baremetal-operator-controller-manager-77b9676b8ctmstz\" (UID: \"9ac4cdbb-6f06-4525-8f5a-61f81a230708\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8ctmstz" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.336162 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l56cs\" (UniqueName: \"kubernetes.io/projected/7abbaca1-f067-43b8-a24e-0219ce7e7eaa-kube-api-access-l56cs\") pod \"nova-operator-controller-manager-64cd67b5cb-mkxn2\" (UID: \"7abbaca1-f067-43b8-a24e-0219ce7e7eaa\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mkxn2" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.336216 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks6ws\" (UniqueName: \"kubernetes.io/projected/9389dca9-fd43-4f83-a3a8-b755859b252e-kube-api-access-ks6ws\") pod \"swift-operator-controller-manager-84d6b4b759-62xmc\" (UID: \"9389dca9-fd43-4f83-a3a8-b755859b252e\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62xmc" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.336318 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7g57\" (UniqueName: \"kubernetes.io/projected/c9b7da57-564a-4053-a476-08db0d87c317-kube-api-access-q7g57\") pod \"telemetry-operator-controller-manager-b8d54b5d7-zzts5\" (UID: \"c9b7da57-564a-4053-a476-08db0d87c317\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-zzts5" Oct 01 13:08:51 crc kubenswrapper[4851]: E1001 13:08:51.336723 4851 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 01 13:08:51 crc kubenswrapper[4851]: E1001 13:08:51.336760 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac4cdbb-6f06-4525-8f5a-61f81a230708-cert podName:9ac4cdbb-6f06-4525-8f5a-61f81a230708 nodeName:}" failed. No retries permitted until 2025-10-01 13:08:51.836746682 +0000 UTC m=+940.181864168 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ac4cdbb-6f06-4525-8f5a-61f81a230708-cert") pod "openstack-baremetal-operator-controller-manager-77b9676b8ctmstz" (UID: "9ac4cdbb-6f06-4525-8f5a-61f81a230708") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.350663 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-62wb7"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.352367 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-62wb7" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.353692 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-flp6z" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.358092 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9bvq\" (UniqueName: \"kubernetes.io/projected/ee6067ef-427a-49d8-99b6-694930e44a0d-kube-api-access-f9bvq\") pod \"octavia-operator-controller-manager-7b787867f4-qxlj6\" (UID: \"ee6067ef-427a-49d8-99b6-694930e44a0d\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-qxlj6" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.358867 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-62wb7"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.368340 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-fjrjs" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.394887 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv6hb\" (UniqueName: \"kubernetes.io/projected/f3be6466-a46d-49b7-a5e4-9465c82ce165-kube-api-access-fv6hb\") pod \"ovn-operator-controller-manager-9976ff44c-c8r2z\" (UID: \"f3be6466-a46d-49b7-a5e4-9465c82ce165\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-c8r2z" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.394979 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcjtk\" (UniqueName: \"kubernetes.io/projected/de177c71-0c78-416b-a62f-2d73d86a2b70-kube-api-access-fcjtk\") pod \"mariadb-operator-controller-manager-88c7-qjqpv\" (UID: \"de177c71-0c78-416b-a62f-2d73d86a2b70\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-qjqpv" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.396319 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks6ws\" (UniqueName: \"kubernetes.io/projected/9389dca9-fd43-4f83-a3a8-b755859b252e-kube-api-access-ks6ws\") pod \"swift-operator-controller-manager-84d6b4b759-62xmc\" (UID: \"9389dca9-fd43-4f83-a3a8-b755859b252e\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62xmc" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.406897 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfq84\" (UniqueName: \"kubernetes.io/projected/d7ddf969-982c-4436-85e4-fb963d57a385-kube-api-access-wfq84\") pod \"placement-operator-controller-manager-589c58c6c-cslt7\" (UID: \"d7ddf969-982c-4436-85e4-fb963d57a385\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-cslt7" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.407894 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-d64f8f9f6-2qx7f"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.408942 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8slw\" (UniqueName: \"kubernetes.io/projected/5d973d57-3aa0-4d14-9c4a-435f6ff880af-kube-api-access-j8slw\") pod \"neutron-operator-controller-manager-849d5b9b84-22n8s\" (UID: \"5d973d57-3aa0-4d14-9c4a-435f6ff880af\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-22n8s" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.409255 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-d64f8f9f6-2qx7f" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.410754 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l56cs\" (UniqueName: \"kubernetes.io/projected/7abbaca1-f067-43b8-a24e-0219ce7e7eaa-kube-api-access-l56cs\") pod \"nova-operator-controller-manager-64cd67b5cb-mkxn2\" (UID: \"7abbaca1-f067-43b8-a24e-0219ce7e7eaa\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mkxn2" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.411446 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vbvj\" (UniqueName: \"kubernetes.io/projected/9ac4cdbb-6f06-4525-8f5a-61f81a230708-kube-api-access-2vbvj\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8ctmstz\" (UID: \"9ac4cdbb-6f06-4525-8f5a-61f81a230708\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8ctmstz" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.417231 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-k5zdt" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.420465 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-d64f8f9f6-2qx7f"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.437298 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qstc2\" (UniqueName: \"kubernetes.io/projected/5f0e4ca8-a97f-4a47-9ead-3e2c33881ef0-kube-api-access-qstc2\") pod \"test-operator-controller-manager-85777745bb-62wb7\" (UID: \"5f0e4ca8-a97f-4a47-9ead-3e2c33881ef0\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-62wb7" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.437350 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7g57\" (UniqueName: \"kubernetes.io/projected/c9b7da57-564a-4053-a476-08db0d87c317-kube-api-access-q7g57\") pod \"telemetry-operator-controller-manager-b8d54b5d7-zzts5\" (UID: \"c9b7da57-564a-4053-a476-08db0d87c317\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-zzts5" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.438999 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5cc886c7f9-x6rkr"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.440092 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5cc886c7f9-x6rkr" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.442098 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vdvsk" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.442294 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.451302 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5cc886c7f9-x6rkr"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.456598 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-zx88h" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.456784 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7g57\" (UniqueName: \"kubernetes.io/projected/c9b7da57-564a-4053-a476-08db0d87c317-kube-api-access-q7g57\") pod \"telemetry-operator-controller-manager-b8d54b5d7-zzts5\" (UID: \"c9b7da57-564a-4053-a476-08db0d87c317\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-zzts5" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.469588 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-qjqpv" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.487897 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62xmc" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.488461 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-22n8s" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.512682 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mkxn2" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.517941 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-zzts5" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.528299 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-26s54"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.529611 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-26s54" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.535955 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-cf6nr" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.538819 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-qxlj6" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.540149 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6e332ad-4dcd-4526-abf5-c79bfce4ee72-cert\") pod \"openstack-operator-controller-manager-5cc886c7f9-x6rkr\" (UID: \"e6e332ad-4dcd-4526-abf5-c79bfce4ee72\") " pod="openstack-operators/openstack-operator-controller-manager-5cc886c7f9-x6rkr" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.540271 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq2nl\" (UniqueName: \"kubernetes.io/projected/e6e332ad-4dcd-4526-abf5-c79bfce4ee72-kube-api-access-qq2nl\") pod \"openstack-operator-controller-manager-5cc886c7f9-x6rkr\" (UID: \"e6e332ad-4dcd-4526-abf5-c79bfce4ee72\") " pod="openstack-operators/openstack-operator-controller-manager-5cc886c7f9-x6rkr" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.540322 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qstc2\" (UniqueName: \"kubernetes.io/projected/5f0e4ca8-a97f-4a47-9ead-3e2c33881ef0-kube-api-access-qstc2\") pod \"test-operator-controller-manager-85777745bb-62wb7\" (UID: \"5f0e4ca8-a97f-4a47-9ead-3e2c33881ef0\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-62wb7" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.540423 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs24h\" (UniqueName: \"kubernetes.io/projected/60b302ff-f344-4556-92dc-e8f7954b80c9-kube-api-access-qs24h\") pod \"watcher-operator-controller-manager-d64f8f9f6-2qx7f\" (UID: \"60b302ff-f344-4556-92dc-e8f7954b80c9\") " pod="openstack-operators/watcher-operator-controller-manager-d64f8f9f6-2qx7f" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.568582 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-26s54"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.574072 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-c8r2z" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.574165 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qstc2\" (UniqueName: \"kubernetes.io/projected/5f0e4ca8-a97f-4a47-9ead-3e2c33881ef0-kube-api-access-qstc2\") pod \"test-operator-controller-manager-85777745bb-62wb7\" (UID: \"5f0e4ca8-a97f-4a47-9ead-3e2c33881ef0\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-62wb7" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.604499 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-nxjn7"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.631932 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-cslt7" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.646389 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63075165-70d9-4dd4-9b52-41e59e59fcee-cert\") pod \"infra-operator-controller-manager-9d6c5db85-4chzf\" (UID: \"63075165-70d9-4dd4-9b52-41e59e59fcee\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-4chzf" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.646446 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs24h\" (UniqueName: \"kubernetes.io/projected/60b302ff-f344-4556-92dc-e8f7954b80c9-kube-api-access-qs24h\") pod \"watcher-operator-controller-manager-d64f8f9f6-2qx7f\" (UID: \"60b302ff-f344-4556-92dc-e8f7954b80c9\") " pod="openstack-operators/watcher-operator-controller-manager-d64f8f9f6-2qx7f" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.646527 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6e332ad-4dcd-4526-abf5-c79bfce4ee72-cert\") pod \"openstack-operator-controller-manager-5cc886c7f9-x6rkr\" (UID: \"e6e332ad-4dcd-4526-abf5-c79bfce4ee72\") " pod="openstack-operators/openstack-operator-controller-manager-5cc886c7f9-x6rkr" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.646555 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhbjl\" (UniqueName: \"kubernetes.io/projected/d08d3102-97ef-4224-8a21-fa66c86a2f11-kube-api-access-lhbjl\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-26s54\" (UID: \"d08d3102-97ef-4224-8a21-fa66c86a2f11\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-26s54" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.646576 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq2nl\" (UniqueName: \"kubernetes.io/projected/e6e332ad-4dcd-4526-abf5-c79bfce4ee72-kube-api-access-qq2nl\") pod \"openstack-operator-controller-manager-5cc886c7f9-x6rkr\" (UID: \"e6e332ad-4dcd-4526-abf5-c79bfce4ee72\") " pod="openstack-operators/openstack-operator-controller-manager-5cc886c7f9-x6rkr" Oct 01 13:08:51 crc kubenswrapper[4851]: E1001 13:08:51.648210 4851 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 01 13:08:51 crc kubenswrapper[4851]: E1001 13:08:51.648291 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6e332ad-4dcd-4526-abf5-c79bfce4ee72-cert podName:e6e332ad-4dcd-4526-abf5-c79bfce4ee72 nodeName:}" failed. No retries permitted until 2025-10-01 13:08:52.148269459 +0000 UTC m=+940.493387035 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e6e332ad-4dcd-4526-abf5-c79bfce4ee72-cert") pod "openstack-operator-controller-manager-5cc886c7f9-x6rkr" (UID: "e6e332ad-4dcd-4526-abf5-c79bfce4ee72") : secret "webhook-server-cert" not found Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.665434 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63075165-70d9-4dd4-9b52-41e59e59fcee-cert\") pod \"infra-operator-controller-manager-9d6c5db85-4chzf\" (UID: \"63075165-70d9-4dd4-9b52-41e59e59fcee\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-4chzf" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.671484 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq2nl\" (UniqueName: \"kubernetes.io/projected/e6e332ad-4dcd-4526-abf5-c79bfce4ee72-kube-api-access-qq2nl\") pod \"openstack-operator-controller-manager-5cc886c7f9-x6rkr\" (UID: \"e6e332ad-4dcd-4526-abf5-c79bfce4ee72\") " pod="openstack-operators/openstack-operator-controller-manager-5cc886c7f9-x6rkr" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.682148 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs24h\" (UniqueName: \"kubernetes.io/projected/60b302ff-f344-4556-92dc-e8f7954b80c9-kube-api-access-qs24h\") pod \"watcher-operator-controller-manager-d64f8f9f6-2qx7f\" (UID: \"60b302ff-f344-4556-92dc-e8f7954b80c9\") " pod="openstack-operators/watcher-operator-controller-manager-d64f8f9f6-2qx7f" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.751910 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhbjl\" (UniqueName: \"kubernetes.io/projected/d08d3102-97ef-4224-8a21-fa66c86a2f11-kube-api-access-lhbjl\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-26s54\" (UID: \"d08d3102-97ef-4224-8a21-fa66c86a2f11\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-26s54" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.779258 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhbjl\" (UniqueName: \"kubernetes.io/projected/d08d3102-97ef-4224-8a21-fa66c86a2f11-kube-api-access-lhbjl\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-26s54\" (UID: \"d08d3102-97ef-4224-8a21-fa66c86a2f11\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-26s54" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.855285 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ac4cdbb-6f06-4525-8f5a-61f81a230708-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8ctmstz\" (UID: \"9ac4cdbb-6f06-4525-8f5a-61f81a230708\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8ctmstz" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.872629 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ac4cdbb-6f06-4525-8f5a-61f81a230708-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8ctmstz\" (UID: \"9ac4cdbb-6f06-4525-8f5a-61f81a230708\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8ctmstz" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.873727 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-62wb7" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.887542 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-8c9zh"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.898123 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-d64f8f9f6-2qx7f" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.901470 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8ctmstz" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.907974 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-nxjn7" event={"ID":"613abfd6-27d5-4f52-bad5-024d71335465","Type":"ContainerStarted","Data":"42a6de12b0f154a696838cfba5f531e5c34f0aeaf697966812c7b819d15d5788"} Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.909098 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-4chzf" Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.925188 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-ms6rc"] Oct 01 13:08:51 crc kubenswrapper[4851]: I1001 13:08:51.964094 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-26s54" Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.170059 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6e332ad-4dcd-4526-abf5-c79bfce4ee72-cert\") pod \"openstack-operator-controller-manager-5cc886c7f9-x6rkr\" (UID: \"e6e332ad-4dcd-4526-abf5-c79bfce4ee72\") " pod="openstack-operators/openstack-operator-controller-manager-5cc886c7f9-x6rkr" Oct 01 13:08:52 crc kubenswrapper[4851]: E1001 13:08:52.170529 4851 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 01 13:08:52 crc kubenswrapper[4851]: E1001 13:08:52.170585 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6e332ad-4dcd-4526-abf5-c79bfce4ee72-cert podName:e6e332ad-4dcd-4526-abf5-c79bfce4ee72 nodeName:}" failed. No retries permitted until 2025-10-01 13:08:53.170568102 +0000 UTC m=+941.515685588 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e6e332ad-4dcd-4526-abf5-c79bfce4ee72-cert") pod "openstack-operator-controller-manager-5cc886c7f9-x6rkr" (UID: "e6e332ad-4dcd-4526-abf5-c79bfce4ee72") : secret "webhook-server-cert" not found Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.295777 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-knmsq"] Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.311977 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-r7gqw"] Oct 01 13:08:52 crc kubenswrapper[4851]: W1001 13:08:52.315257 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dcb48dd_5baf_415e_861e_ebfa40fc2e84.slice/crio-42b56944f38341d5ec3977bf80d01c3605888b8d57d7176a0c3c60af41b2b6fd WatchSource:0}: Error finding container 42b56944f38341d5ec3977bf80d01c3605888b8d57d7176a0c3c60af41b2b6fd: Status 404 returned error can't find the container with id 42b56944f38341d5ec3977bf80d01c3605888b8d57d7176a0c3c60af41b2b6fd Oct 01 13:08:52 crc kubenswrapper[4851]: W1001 13:08:52.319218 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba805aa7_4c6e_4dc1_8de8_c935ab1c2128.slice/crio-286202ed01fd4404f0cc555c2a8e2196c77ac388c7f02cc1e21219ea26102389 WatchSource:0}: Error finding container 286202ed01fd4404f0cc555c2a8e2196c77ac388c7f02cc1e21219ea26102389: Status 404 returned error can't find the container with id 286202ed01fd4404f0cc555c2a8e2196c77ac388c7f02cc1e21219ea26102389 Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.386584 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-zx88h"] Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.490024 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-dq52c"] Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.511177 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-flp6z"] Oct 01 13:08:52 crc kubenswrapper[4851]: W1001 13:08:52.521563 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7cf8be8_c21f_4904_838e_f185857ef960.slice/crio-6934379e0d854f2ba4829ad0eab348207d70b0afad85f744274e311f257a6b8a WatchSource:0}: Error finding container 6934379e0d854f2ba4829ad0eab348207d70b0afad85f744274e311f257a6b8a: Status 404 returned error can't find the container with id 6934379e0d854f2ba4829ad0eab348207d70b0afad85f744274e311f257a6b8a Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.557865 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-tssnp"] Oct 01 13:08:52 crc kubenswrapper[4851]: W1001 13:08:52.558901 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda01e01ff_d00e_482d_8901_72c0705672f1.slice/crio-22e23e8a214fbf4730007f3458de47d6ab779b0c27428388fa1d5219e32e9631 WatchSource:0}: Error finding container 22e23e8a214fbf4730007f3458de47d6ab779b0c27428388fa1d5219e32e9631: Status 404 returned error can't find the container with id 
22e23e8a214fbf4730007f3458de47d6ab779b0c27428388fa1d5219e32e9631 Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.646853 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-cslt7"] Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.654953 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-qjqpv"] Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.665483 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-c8r2z"] Oct 01 13:08:52 crc kubenswrapper[4851]: W1001 13:08:52.671019 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde177c71_0c78_416b_a62f_2d73d86a2b70.slice/crio-a81877aa0a67fa84dd9d05362f1d418677b4ce574726336f7510a4142c6cc1bc WatchSource:0}: Error finding container a81877aa0a67fa84dd9d05362f1d418677b4ce574726336f7510a4142c6cc1bc: Status 404 returned error can't find the container with id a81877aa0a67fa84dd9d05362f1d418677b4ce574726336f7510a4142c6cc1bc Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.677086 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-22n8s"] Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.748250 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-zzts5"] Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.758023 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-62xmc"] Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.764663 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-62wb7"] Oct 01 13:08:52 crc kubenswrapper[4851]: E1001 13:08:52.765776 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q7g57,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-b8d54b5d7-zzts5_openstack-operators(c9b7da57-564a-4053-a476-08db0d87c317): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 13:08:52 crc kubenswrapper[4851]: W1001 13:08:52.766848 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7abbaca1_f067_43b8_a24e_0219ce7e7eaa.slice/crio-0da39ae22076cfa8fdebad9382612575c908066681106188f2e11a77af2bdffb WatchSource:0}: Error finding container 0da39ae22076cfa8fdebad9382612575c908066681106188f2e11a77af2bdffb: Status 404 returned error can't find the container with id 0da39ae22076cfa8fdebad9382612575c908066681106188f2e11a77af2bdffb Oct 01 13:08:52 crc kubenswrapper[4851]: E1001 13:08:52.767150 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qstc2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-85777745bb-62wb7_openstack-operators(5f0e4ca8-a97f-4a47-9ead-3e2c33881ef0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.772196 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-mkxn2"] Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.778867 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-qxlj6"] Oct 01 13:08:52 crc kubenswrapper[4851]: E1001 13:08:52.784423 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:a517abc6427ab73fed93b0bd89a6eb52d0311fbfb0c00752f889baf8ffd5068f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l56cs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-64cd67b5cb-mkxn2_openstack-operators(7abbaca1-f067-43b8-a24e-0219ce7e7eaa): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.879168 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-4chzf"] Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.890460 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8ctmstz"] Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.898609 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-26s54"] Oct 01 13:08:52 crc kubenswrapper[4851]: E1001 13:08:52.904123 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Val
ue:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.
io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/o
penstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_LIGHTSPEED_IMAGE_URL_DEFAULT,Value:quay.io/openstack-lightspeed/rag-content:os-docs-2024.2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m 
DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2vbvj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-77b9676b8ctmstz_openstack-operators(9ac4cdbb-6f06-4525-8f5a-61f81a230708): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.909553 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-d64f8f9f6-2qx7f"] Oct 01 13:08:52 crc kubenswrapper[4851]: W1001 13:08:52.918576 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd08d3102_97ef_4224_8a21_fa66c86a2f11.slice/crio-f9e30db9bada575b564f31b390d4fb43465203a4063bc762d05db0824519edd6 WatchSource:0}: Error finding container f9e30db9bada575b564f31b390d4fb43465203a4063bc762d05db0824519edd6: Status 404 returned error can't find the container with id f9e30db9bada575b564f31b390d4fb43465203a4063bc762d05db0824519edd6 Oct 01 13:08:52 crc kubenswrapper[4851]: E1001 13:08:52.921839 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lhbjl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-26s54_openstack-operators(d08d3102-97ef-4224-8a21-fa66c86a2f11): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.922401 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-qjqpv" event={"ID":"de177c71-0c78-416b-a62f-2d73d86a2b70","Type":"ContainerStarted","Data":"a81877aa0a67fa84dd9d05362f1d418677b4ce574726336f7510a4142c6cc1bc"} Oct 01 13:08:52 crc kubenswrapper[4851]: E1001 13:08:52.923047 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-26s54" podUID="d08d3102-97ef-4224-8a21-fa66c86a2f11" Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.924142 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-dq52c" event={"ID":"ac9e1ccb-a68d-446e-b47b-de00d828f332","Type":"ContainerStarted","Data":"065d58fc7462413c63c1f3a5da6b6c716a323d6b01a32b02bd9945fe2e434706"} Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.926035 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-62wb7" event={"ID":"5f0e4ca8-a97f-4a47-9ead-3e2c33881ef0","Type":"ContainerStarted","Data":"2bc4c9ff52ea0bb82c34dbf0cb02f6d3d140bedf72646226aed394332c8d57d4"} Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.928178 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62xmc" event={"ID":"9389dca9-fd43-4f83-a3a8-b755859b252e","Type":"ContainerStarted","Data":"ca2d7dca05a2b8c121bb1e5fded891577d03707e9e9c0a9f7eb499c3a10add53"} Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.929994 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-tssnp" event={"ID":"a01e01ff-d00e-482d-8901-72c0705672f1","Type":"ContainerStarted","Data":"22e23e8a214fbf4730007f3458de47d6ab779b0c27428388fa1d5219e32e9631"} Oct 01 13:08:52 crc kubenswrapper[4851]: W1001 13:08:52.934268 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60b302ff_f344_4556_92dc_e8f7954b80c9.slice/crio-ac0198dd7f93992c34e7d81d825c103378184783c5cc8c0b40cb18c2f6f14bf3 WatchSource:0}: Error finding container ac0198dd7f93992c34e7d81d825c103378184783c5cc8c0b40cb18c2f6f14bf3: Status 404 returned error can't find the container with id ac0198dd7f93992c34e7d81d825c103378184783c5cc8c0b40cb18c2f6f14bf3 Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.935171 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-cslt7" event={"ID":"d7ddf969-982c-4436-85e4-fb963d57a385","Type":"ContainerStarted","Data":"f09735163cac0eda2b20f94d4876a1eede35d84eb613237627a3eb19a38cf1f6"} Oct 01 13:08:52 crc kubenswrapper[4851]: W1001 13:08:52.942486 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63075165_70d9_4dd4_9b52_41e59e59fcee.slice/crio-d62e2ce830147f6d1d73efe872aad98fc81c1b3475f026dfaa71a80db2dd00db WatchSource:0}: Error finding container d62e2ce830147f6d1d73efe872aad98fc81c1b3475f026dfaa71a80db2dd00db: Status 404 returned error can't find the container with id d62e2ce830147f6d1d73efe872aad98fc81c1b3475f026dfaa71a80db2dd00db Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.942975 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-8c9zh" event={"ID":"a1e24a95-87b4-4f06-b651-9c8c26a7021d","Type":"ContainerStarted","Data":"dc8654fda3efc3140a5b735574a62f87690af20eaeae83c04fa5c9a77d6b3e04"} Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.944375 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-knmsq" event={"ID":"4dcb48dd-5baf-415e-861e-ebfa40fc2e84","Type":"ContainerStarted","Data":"42b56944f38341d5ec3977bf80d01c3605888b8d57d7176a0c3c60af41b2b6fd"} Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.945629 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-22n8s" event={"ID":"5d973d57-3aa0-4d14-9c4a-435f6ff880af","Type":"ContainerStarted","Data":"032b417d34bddf16f60a8c907f001ca5c5d44ba804edb416d0d7b60b5301896f"} Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.946740 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-qxlj6" event={"ID":"ee6067ef-427a-49d8-99b6-694930e44a0d","Type":"ContainerStarted","Data":"09c02d293b8593dc5954361bdad022c15b1f8779e67cc7fca7f0e6851f830cac"} Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.948671 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-c8r2z" event={"ID":"f3be6466-a46d-49b7-a5e4-9465c82ce165","Type":"ContainerStarted","Data":"d236da132bdea87fcad6d53de26f027c7cab715216e4a8e5d934248a3b2796d8"} Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.949817 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mkxn2" event={"ID":"7abbaca1-f067-43b8-a24e-0219ce7e7eaa","Type":"ContainerStarted","Data":"0da39ae22076cfa8fdebad9382612575c908066681106188f2e11a77af2bdffb"} Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.951464 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8ctmstz" event={"ID":"9ac4cdbb-6f06-4525-8f5a-61f81a230708","Type":"ContainerStarted","Data":"8d4f0f25b03d329a438448cf621e0de5c08e3e808ea58eb8d32e10634f9337a7"} Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.954781 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-flp6z" event={"ID":"a7cf8be8-c21f-4904-838e-f185857ef960","Type":"ContainerStarted","Data":"6934379e0d854f2ba4829ad0eab348207d70b0afad85f744274e311f257a6b8a"} Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.955755 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-zzts5" event={"ID":"c9b7da57-564a-4053-a476-08db0d87c317","Type":"ContainerStarted","Data":"51f4a4c6f572505088dcfd4bc31192f26f48e1da0d5854adda2f0f14d3d3dca8"} Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.957801 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-ms6rc" event={"ID":"08cf0ffd-5ec4-4406-a912-319a1c9ced15","Type":"ContainerStarted","Data":"eb873138cd67f4ec454a507a932a0429f186e6aadf7f007932593b35d91d083b"} Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.958874 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-zx88h" event={"ID":"324c39a8-85b5-4caf-a719-a5f47a827d08","Type":"ContainerStarted","Data":"c2f86653d7bd04d108f47efbc59d5949489b744e0ea2b5ebeb25ef14f3679422"} Oct 01 13:08:52 crc kubenswrapper[4851]: I1001 13:08:52.961100 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-r7gqw" event={"ID":"ba805aa7-4c6e-4dc1-8de8-c935ab1c2128","Type":"ContainerStarted","Data":"286202ed01fd4404f0cc555c2a8e2196c77ac388c7f02cc1e21219ea26102389"} Oct 01 13:08:52 crc kubenswrapper[4851]: E1001 13:08:52.962747 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.36:5001/openstack-k8s-operators/watcher-operator:b08782af89d66d20cc8c642c5fce3917a5bca481,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qs24h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-d64f8f9f6-2qx7f_openstack-operators(60b302ff-f344-4556-92dc-e8f7954b80c9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 13:08:52 crc kubenswrapper[4851]: E1001 13:08:52.963098 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7dzf7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-9d6c5db85-4chzf_openstack-operators(63075165-70d9-4dd4-9b52-41e59e59fcee): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 13:08:53 crc kubenswrapper[4851]: E1001 13:08:53.174311 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-62wb7" podUID="5f0e4ca8-a97f-4a47-9ead-3e2c33881ef0" Oct 01 13:08:53 crc kubenswrapper[4851]: I1001 13:08:53.184182 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6e332ad-4dcd-4526-abf5-c79bfce4ee72-cert\") pod \"openstack-operator-controller-manager-5cc886c7f9-x6rkr\" (UID: \"e6e332ad-4dcd-4526-abf5-c79bfce4ee72\") " pod="openstack-operators/openstack-operator-controller-manager-5cc886c7f9-x6rkr" Oct 01 13:08:53 crc kubenswrapper[4851]: E1001 13:08:53.202014 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-zzts5" podUID="c9b7da57-564a-4053-a476-08db0d87c317" Oct 01 13:08:53 crc kubenswrapper[4851]: E1001 13:08:53.207894 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mkxn2" podUID="7abbaca1-f067-43b8-a24e-0219ce7e7eaa" Oct 01 13:08:53 crc kubenswrapper[4851]: I1001 13:08:53.211946 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6e332ad-4dcd-4526-abf5-c79bfce4ee72-cert\") pod \"openstack-operator-controller-manager-5cc886c7f9-x6rkr\" (UID: \"e6e332ad-4dcd-4526-abf5-c79bfce4ee72\") " pod="openstack-operators/openstack-operator-controller-manager-5cc886c7f9-x6rkr" Oct 01 13:08:53 crc kubenswrapper[4851]: E1001 13:08:53.394709 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-4chzf" podUID="63075165-70d9-4dd4-9b52-41e59e59fcee" Oct 01 13:08:53 crc kubenswrapper[4851]: E1001 13:08:53.415137 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8ctmstz" podUID="9ac4cdbb-6f06-4525-8f5a-61f81a230708" Oct 01 13:08:53 crc kubenswrapper[4851]: I1001 
13:08:53.439773 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5cc886c7f9-x6rkr" Oct 01 13:08:53 crc kubenswrapper[4851]: E1001 13:08:53.453519 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-d64f8f9f6-2qx7f" podUID="60b302ff-f344-4556-92dc-e8f7954b80c9" Oct 01 13:08:53 crc kubenswrapper[4851]: I1001 13:08:53.979298 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5cc886c7f9-x6rkr"] Oct 01 13:08:53 crc kubenswrapper[4851]: I1001 13:08:53.985750 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8ctmstz" event={"ID":"9ac4cdbb-6f06-4525-8f5a-61f81a230708","Type":"ContainerStarted","Data":"46eece297cbfb4cf6cf2b68dd0cfcaae1efc0de77f679727c500abff15e6abcb"} Oct 01 13:08:53 crc kubenswrapper[4851]: E1001 13:08:53.987744 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8ctmstz" podUID="9ac4cdbb-6f06-4525-8f5a-61f81a230708" Oct 01 13:08:54 crc kubenswrapper[4851]: I1001 13:08:54.018626 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-zzts5" event={"ID":"c9b7da57-564a-4053-a476-08db0d87c317","Type":"ContainerStarted","Data":"1cc054f17016ee0ea1d5611fca25567599713904c752c99956fed212be5cd69e"} Oct 01 13:08:54 crc kubenswrapper[4851]: E1001 13:08:54.038790 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-zzts5" podUID="c9b7da57-564a-4053-a476-08db0d87c317" Oct 01 13:08:54 crc kubenswrapper[4851]: I1001 13:08:54.046703 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-4chzf" event={"ID":"63075165-70d9-4dd4-9b52-41e59e59fcee","Type":"ContainerStarted","Data":"6488cdf2cdcedab8afa08214aae58c90a4010aa2989b63ef036773e0ffaa5b9c"} Oct 01 13:08:54 crc kubenswrapper[4851]: I1001 13:08:54.046739 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-4chzf" event={"ID":"63075165-70d9-4dd4-9b52-41e59e59fcee","Type":"ContainerStarted","Data":"d62e2ce830147f6d1d73efe872aad98fc81c1b3475f026dfaa71a80db2dd00db"} Oct 01 13:08:54 crc kubenswrapper[4851]: E1001 13:08:54.047785 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3\\\"\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-4chzf" podUID="63075165-70d9-4dd4-9b52-41e59e59fcee" 
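
The burst of "ErrImagePull: pull QPS exceeded" failures above comes from the kubelet's client-side image-pull rate limit, not from the registry: with this many operator pods pulling images at once, pulls beyond the configured QPS/burst are rejected and retried, which is why the same pods report ContainerStarted and ImagePullBackOff clearing moments later in this log. A minimal sketch of the two KubeletConfiguration knobs involved, assuming the upstream k8s.io/kubelet/config/v1beta1 Go types and their documented defaults (registryPullQPS 5, registryBurst 10); this is illustrative, not the configuration actually in effect on this node:

    // Illustrative only: the two KubeletConfiguration fields that govern
    // the "pull QPS exceeded" error seen in this log.
    package main

    import (
        "fmt"

        kubeletconfig "k8s.io/kubelet/config/v1beta1"
    )

    func main() {
        qps := int32(5) // kubelet default; 0 disables the pull rate limit entirely
        cfg := kubeletconfig.KubeletConfiguration{
            RegistryPullQPS: &qps, // sustained image pulls per second
            RegistryBurst:   10,   // default burst; only consulted when RegistryPullQPS > 0
        }
        fmt.Printf("registryPullQPS=%d registryBurst=%d\n",
            *cfg.RegistryPullQPS, cfg.RegistryBurst)
    }

Raising registryPullQPS/registryBurst in the node's kubelet config (or setting the QPS to 0) is the usual mitigation when a node legitimately needs to pull many images simultaneously, as during this operator rollout.
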
Oct 01 13:08:54 crc kubenswrapper[4851]: I1001 13:08:54.048075 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mkxn2" event={"ID":"7abbaca1-f067-43b8-a24e-0219ce7e7eaa","Type":"ContainerStarted","Data":"6384ea8531cac4c17c20f456fad3602412deca0bb3b13208ccec901d74663d22"} Oct 01 13:08:54 crc kubenswrapper[4851]: E1001 13:08:54.049871 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:a517abc6427ab73fed93b0bd89a6eb52d0311fbfb0c00752f889baf8ffd5068f\\\"\"" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mkxn2" podUID="7abbaca1-f067-43b8-a24e-0219ce7e7eaa" Oct 01 13:08:54 crc kubenswrapper[4851]: I1001 13:08:54.074265 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-d64f8f9f6-2qx7f" event={"ID":"60b302ff-f344-4556-92dc-e8f7954b80c9","Type":"ContainerStarted","Data":"9a7eeaf07816aad5b32ca2091836f5dc4f720a360e7fcc4789a3638097d98ba3"} Oct 01 13:08:54 crc kubenswrapper[4851]: I1001 13:08:54.074350 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-d64f8f9f6-2qx7f" event={"ID":"60b302ff-f344-4556-92dc-e8f7954b80c9","Type":"ContainerStarted","Data":"ac0198dd7f93992c34e7d81d825c103378184783c5cc8c0b40cb18c2f6f14bf3"} Oct 01 13:08:54 crc kubenswrapper[4851]: E1001 13:08:54.076062 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/openstack-k8s-operators/watcher-operator:b08782af89d66d20cc8c642c5fce3917a5bca481\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-d64f8f9f6-2qx7f" podUID="60b302ff-f344-4556-92dc-e8f7954b80c9" Oct 01 13:08:54 crc kubenswrapper[4851]: I1001 13:08:54.085810 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-26s54" event={"ID":"d08d3102-97ef-4224-8a21-fa66c86a2f11","Type":"ContainerStarted","Data":"f9e30db9bada575b564f31b390d4fb43465203a4063bc762d05db0824519edd6"} Oct 01 13:08:54 crc kubenswrapper[4851]: E1001 13:08:54.087832 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-26s54" podUID="d08d3102-97ef-4224-8a21-fa66c86a2f11" Oct 01 13:08:54 crc kubenswrapper[4851]: I1001 13:08:54.102382 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-62wb7" event={"ID":"5f0e4ca8-a97f-4a47-9ead-3e2c33881ef0","Type":"ContainerStarted","Data":"584cce70bd27f9eaaaac5737ebf14ff52c51d1ced8b85a7a4df5dd91c58348c4"} Oct 01 13:08:54 crc kubenswrapper[4851]: E1001 13:08:54.108633 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3\\\"\"" 
pod="openstack-operators/test-operator-controller-manager-85777745bb-62wb7" podUID="5f0e4ca8-a97f-4a47-9ead-3e2c33881ef0" Oct 01 13:08:55 crc kubenswrapper[4851]: I1001 13:08:55.132887 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5cc886c7f9-x6rkr" event={"ID":"e6e332ad-4dcd-4526-abf5-c79bfce4ee72","Type":"ContainerStarted","Data":"daf350c604a1ef9879b9923646b5fbf71ad0f353b4a6f8a49bb2e1859d1f4741"} Oct 01 13:08:55 crc kubenswrapper[4851]: I1001 13:08:55.133228 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5cc886c7f9-x6rkr" event={"ID":"e6e332ad-4dcd-4526-abf5-c79bfce4ee72","Type":"ContainerStarted","Data":"4bdc2829c798e31a1af3b8a20ff9997c02b6873ac9d55fcd58cb554ae69e5fae"} Oct 01 13:08:55 crc kubenswrapper[4851]: E1001 13:08:55.134595 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8ctmstz" podUID="9ac4cdbb-6f06-4525-8f5a-61f81a230708" Oct 01 13:08:55 crc kubenswrapper[4851]: E1001 13:08:55.135907 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-26s54" podUID="d08d3102-97ef-4224-8a21-fa66c86a2f11" Oct 01 13:08:55 crc kubenswrapper[4851]: E1001 13:08:55.136404 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:a517abc6427ab73fed93b0bd89a6eb52d0311fbfb0c00752f889baf8ffd5068f\\\"\"" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mkxn2" podUID="7abbaca1-f067-43b8-a24e-0219ce7e7eaa" Oct 01 13:08:55 crc kubenswrapper[4851]: E1001 13:08:55.136547 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-zzts5" podUID="c9b7da57-564a-4053-a476-08db0d87c317" Oct 01 13:08:55 crc kubenswrapper[4851]: E1001 13:08:55.136709 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3\\\"\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-62wb7" podUID="5f0e4ca8-a97f-4a47-9ead-3e2c33881ef0" Oct 01 13:08:55 crc kubenswrapper[4851]: E1001 13:08:55.141767 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/openstack-k8s-operators/watcher-operator:b08782af89d66d20cc8c642c5fce3917a5bca481\\\"\"" 
pod="openstack-operators/watcher-operator-controller-manager-d64f8f9f6-2qx7f" podUID="60b302ff-f344-4556-92dc-e8f7954b80c9" Oct 01 13:08:55 crc kubenswrapper[4851]: E1001 13:08:55.141775 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3\\\"\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-4chzf" podUID="63075165-70d9-4dd4-9b52-41e59e59fcee" Oct 01 13:09:00 crc kubenswrapper[4851]: I1001 13:09:00.050353 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:09:00 crc kubenswrapper[4851]: I1001 13:09:00.050458 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:09:00 crc kubenswrapper[4851]: I1001 13:09:00.050580 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 13:09:00 crc kubenswrapper[4851]: I1001 13:09:00.051836 4851 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"be5d6b868e9238c5d4395c014452c5cfe7dc87bf6a9741e8af0bded2d6b25de6"} pod="openshift-machine-config-operator/machine-config-daemon-fv72m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:09:00 crc kubenswrapper[4851]: I1001 13:09:00.051929 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" containerID="cri-o://be5d6b868e9238c5d4395c014452c5cfe7dc87bf6a9741e8af0bded2d6b25de6" gracePeriod=600 Oct 01 13:09:01 crc kubenswrapper[4851]: I1001 13:09:01.182136 4851 generic.go:334] "Generic (PLEG): container finished" podID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerID="be5d6b868e9238c5d4395c014452c5cfe7dc87bf6a9741e8af0bded2d6b25de6" exitCode=0 Oct 01 13:09:01 crc kubenswrapper[4851]: I1001 13:09:01.182234 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerDied","Data":"be5d6b868e9238c5d4395c014452c5cfe7dc87bf6a9741e8af0bded2d6b25de6"} Oct 01 13:09:01 crc kubenswrapper[4851]: I1001 13:09:01.182806 4851 scope.go:117] "RemoveContainer" containerID="288531af0e115f9595ac6f6f759c2572ba4e5c19461b4094fb567dd41bccf2dd" Oct 01 13:09:05 crc kubenswrapper[4851]: E1001 13:09:05.699399 4851 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302" Oct 01 13:09:05 crc kubenswrapper[4851]: E1001 13:09:05.700891 4851 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fv6hb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-9976ff44c-c8r2z_openstack-operators(f3be6466-a46d-49b7-a5e4-9465c82ce165): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 13:09:07 crc kubenswrapper[4851]: E1001 13:09:07.044543 4851 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:f5f0d2eb534f763cf6578af513add1c21c1659b2cd75214dfddfedb9eebf6397" Oct 01 13:09:07 crc kubenswrapper[4851]: E1001 13:09:07.045313 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:f5f0d2eb534f763cf6578af513add1c21c1659b2cd75214dfddfedb9eebf6397,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p9dbr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-9f4696d94-tssnp_openstack-operators(a01e01ff-d00e-482d-8901-72c0705672f1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 13:09:24 crc kubenswrapper[4851]: E1001 13:09:24.842789 4851 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:e1328760310f3bbf4548b8b1268cd711087dd91212b92bb0be287cad1f1b6fe9" Oct 01 13:09:24 crc kubenswrapper[4851]: E1001 13:09:24.843368 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:e1328760310f3bbf4548b8b1268cd711087dd91212b92bb0be287cad1f1b6fe9,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f9bvq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7b787867f4-qxlj6_openstack-operators(ee6067ef-427a-49d8-99b6-694930e44a0d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 13:09:25 crc kubenswrapper[4851]: E1001 13:09:25.227353 4851 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:acdeebaa51f962066f42f38b6c2d34a62fc6a24f58f9ee63d61b1e0cafbb29f8" Oct 01 13:09:25 crc kubenswrapper[4851]: E1001 13:09:25.227882 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:acdeebaa51f962066f42f38b6c2d34a62fc6a24f58f9ee63d61b1e0cafbb29f8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j8slw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-849d5b9b84-22n8s_openstack-operators(5d973d57-3aa0-4d14-9c4a-435f6ff880af): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 13:09:25 crc kubenswrapper[4851]: E1001 13:09:25.940733 4851 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:4cdb30423c14ab48888aeeb699259bd9051284ec9f874ed9bab94c7965f45884" Oct 01 13:09:25 crc kubenswrapper[4851]: E1001 13:09:25.940950 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:4cdb30423c14ab48888aeeb699259bd9051284ec9f874ed9bab94c7965f45884,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-89b62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-6d68dbc695-zx88h_openstack-operators(324c39a8-85b5-4caf-a719-a5f47a827d08): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 13:09:26 crc kubenswrapper[4851]: E1001 13:09:26.680962 4851 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:bca053da8adc37a9a246b478949960ac7abef8fcc0c58a2a45045c59a62b5fe4" Oct 01 13:09:26 crc kubenswrapper[4851]: E1001 13:09:26.681178 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:bca053da8adc37a9a246b478949960ac7abef8fcc0c58a2a45045c59a62b5fe4,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ks6ws,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-84d6b4b759-62xmc_openstack-operators(9389dca9-fd43-4f83-a3a8-b755859b252e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 13:09:34 crc kubenswrapper[4851]: E1001 13:09:34.280666 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-tssnp" podUID="a01e01ff-d00e-482d-8901-72c0705672f1" Oct 01 13:09:34 crc kubenswrapper[4851]: E1001 13:09:34.286052 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-c8r2z" podUID="f3be6466-a46d-49b7-a5e4-9465c82ce165" Oct 01 13:09:34 crc kubenswrapper[4851]: I1001 13:09:34.493831 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5cc886c7f9-x6rkr" event={"ID":"e6e332ad-4dcd-4526-abf5-c79bfce4ee72","Type":"ContainerStarted","Data":"f41933a984f5f031030c0f9da79680c44ceb9dbfc690a0a84fecebe0747de483"} Oct 01 13:09:34 crc kubenswrapper[4851]: I1001 13:09:34.494027 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5cc886c7f9-x6rkr" Oct 01 13:09:34 crc kubenswrapper[4851]: I1001 13:09:34.496883 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-tssnp" event={"ID":"a01e01ff-d00e-482d-8901-72c0705672f1","Type":"ContainerStarted","Data":"e8b6146e2d4734e987d822df4b45334bf06e377924175406f690e670f0c2ebb6"} Oct 01 13:09:34 crc kubenswrapper[4851]: I1001 13:09:34.499469 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-c8r2z" event={"ID":"f3be6466-a46d-49b7-a5e4-9465c82ce165","Type":"ContainerStarted","Data":"91517468ef673493981213e5e094d968d844916232099e3a1924c76a435d451f"} Oct 01 13:09:34 crc kubenswrapper[4851]: I1001 13:09:34.506335 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5cc886c7f9-x6rkr" Oct 01 13:09:34 crc kubenswrapper[4851]: I1001 13:09:34.526845 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5cc886c7f9-x6rkr" podStartSLOduration=43.52682258 podStartE2EDuration="43.52682258s" 
podCreationTimestamp="2025-10-01 13:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:09:34.520424919 +0000 UTC m=+982.865542435" watchObservedRunningTime="2025-10-01 13:09:34.52682258 +0000 UTC m=+982.871940066" Oct 01 13:09:35 crc kubenswrapper[4851]: E1001 13:09:35.023425 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-qxlj6" podUID="ee6067ef-427a-49d8-99b6-694930e44a0d" Oct 01 13:09:35 crc kubenswrapper[4851]: E1001 13:09:35.209318 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-zx88h" podUID="324c39a8-85b5-4caf-a719-a5f47a827d08" Oct 01 13:09:35 crc kubenswrapper[4851]: I1001 13:09:35.513272 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-qxlj6" event={"ID":"ee6067ef-427a-49d8-99b6-694930e44a0d","Type":"ContainerStarted","Data":"61da2d9ab427a9b228faa1d1be11afd65646582555ff93fb1044570fb3fa1b8d"} Oct 01 13:09:35 crc kubenswrapper[4851]: I1001 13:09:35.515281 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-4chzf" event={"ID":"63075165-70d9-4dd4-9b52-41e59e59fcee","Type":"ContainerStarted","Data":"b683fe16e457ac6f0860afbdced743c5a524dc9c698babd20d6b08905743f7c5"} Oct 01 13:09:35 crc kubenswrapper[4851]: I1001 13:09:35.515470 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-4chzf" Oct 01 13:09:35 crc kubenswrapper[4851]: I1001 13:09:35.519911 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-ms6rc" event={"ID":"08cf0ffd-5ec4-4406-a912-319a1c9ced15","Type":"ContainerStarted","Data":"7e4a867d8c3d230e94a3d5db4b779b5ecffc8537dd809bffcc27a25e0ad06725"} Oct 01 13:09:35 crc kubenswrapper[4851]: I1001 13:09:35.528971 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"91826d492b7e20fab5770efae0e5337ce96401b4dbf9f9356c89538e943aab30"} Oct 01 13:09:35 crc kubenswrapper[4851]: I1001 13:09:35.533492 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-zx88h" event={"ID":"324c39a8-85b5-4caf-a719-a5f47a827d08","Type":"ContainerStarted","Data":"15ca98d0c43a671ab8cf4fa5e501ab38698a4c3742e891d5679d2801a488f0f8"} Oct 01 13:09:35 crc kubenswrapper[4851]: E1001 13:09:35.534679 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:4cdb30423c14ab48888aeeb699259bd9051284ec9f874ed9bab94c7965f45884\\\"\"" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-zx88h" podUID="324c39a8-85b5-4caf-a719-a5f47a827d08" Oct 01 13:09:35 crc kubenswrapper[4851]: I1001 
13:09:35.546166 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-26s54" event={"ID":"d08d3102-97ef-4224-8a21-fa66c86a2f11","Type":"ContainerStarted","Data":"34672828f23293f8ea84d0ab2ac2f6cf4b4db52f27fdbd4f72d738570805617f"} Oct 01 13:09:35 crc kubenswrapper[4851]: E1001 13:09:35.555102 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62xmc" podUID="9389dca9-fd43-4f83-a3a8-b755859b252e" Oct 01 13:09:35 crc kubenswrapper[4851]: I1001 13:09:35.621908 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-4chzf" podStartSLOduration=3.699087941 podStartE2EDuration="45.621889139s" podCreationTimestamp="2025-10-01 13:08:50 +0000 UTC" firstStartedPulling="2025-10-01 13:08:52.962596675 +0000 UTC m=+941.307714161" lastFinishedPulling="2025-10-01 13:09:34.885397873 +0000 UTC m=+983.230515359" observedRunningTime="2025-10-01 13:09:35.614143149 +0000 UTC m=+983.959260635" watchObservedRunningTime="2025-10-01 13:09:35.621889139 +0000 UTC m=+983.967006625" Oct 01 13:09:35 crc kubenswrapper[4851]: I1001 13:09:35.660617 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-26s54" podStartSLOduration=2.812276995 podStartE2EDuration="44.660598668s" podCreationTimestamp="2025-10-01 13:08:51 +0000 UTC" firstStartedPulling="2025-10-01 13:08:52.921743645 +0000 UTC m=+941.266861131" lastFinishedPulling="2025-10-01 13:09:34.770065308 +0000 UTC m=+983.115182804" observedRunningTime="2025-10-01 13:09:35.642874675 +0000 UTC m=+983.987992161" watchObservedRunningTime="2025-10-01 13:09:35.660598668 +0000 UTC m=+984.005716154" Oct 01 13:09:35 crc kubenswrapper[4851]: E1001 13:09:35.737837 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-22n8s" podUID="5d973d57-3aa0-4d14-9c4a-435f6ff880af" Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.604187 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-d64f8f9f6-2qx7f" event={"ID":"60b302ff-f344-4556-92dc-e8f7954b80c9","Type":"ContainerStarted","Data":"51ed487d4b6425a9bfc7c8accfdfaedf9dee514a6980dfde95cc4f98a1a89521"} Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.605340 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-d64f8f9f6-2qx7f" Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.627432 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-tssnp" event={"ID":"a01e01ff-d00e-482d-8901-72c0705672f1","Type":"ContainerStarted","Data":"69f7eb6054883c0269cd33990aea36ace029b2e428a54ae5ceb1c780f01fcf61"} Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.628193 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-tssnp" Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 
13:09:36.629577 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-c8r2z" event={"ID":"f3be6466-a46d-49b7-a5e4-9465c82ce165","Type":"ContainerStarted","Data":"d11e6aca91d08638d86b50267362b54a532f9cec1dfe737b717f7b55e703fd50"} Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.629996 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-c8r2z" Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.640136 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mkxn2" event={"ID":"7abbaca1-f067-43b8-a24e-0219ce7e7eaa","Type":"ContainerStarted","Data":"68c0ccdce98990587934337b93430b2c61ec21be6dcf3eabfe9f5de1c5d135b7"} Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.640922 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mkxn2" Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.675919 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-62wb7" event={"ID":"5f0e4ca8-a97f-4a47-9ead-3e2c33881ef0","Type":"ContainerStarted","Data":"8cb2e04e453d4fecd0ea97d6920744e025137b5e3f4ba001cf1b84f81c883118"} Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.676792 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-85777745bb-62wb7" Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.692078 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-d64f8f9f6-2qx7f" podStartSLOduration=3.6647293960000002 podStartE2EDuration="45.692063282s" podCreationTimestamp="2025-10-01 13:08:51 +0000 UTC" firstStartedPulling="2025-10-01 13:08:52.962585875 +0000 UTC m=+941.307703361" lastFinishedPulling="2025-10-01 13:09:34.989919771 +0000 UTC m=+983.335037247" observedRunningTime="2025-10-01 13:09:36.648028262 +0000 UTC m=+984.993145738" watchObservedRunningTime="2025-10-01 13:09:36.692063282 +0000 UTC m=+985.037180768" Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.692263 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-c8r2z" podStartSLOduration=4.247979871 podStartE2EDuration="46.692258278s" podCreationTimestamp="2025-10-01 13:08:50 +0000 UTC" firstStartedPulling="2025-10-01 13:08:52.691315451 +0000 UTC m=+941.036432937" lastFinishedPulling="2025-10-01 13:09:35.135593858 +0000 UTC m=+983.480711344" observedRunningTime="2025-10-01 13:09:36.689538281 +0000 UTC m=+985.034655767" watchObservedRunningTime="2025-10-01 13:09:36.692258278 +0000 UTC m=+985.037375764" Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.713957 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-cslt7" event={"ID":"d7ddf969-982c-4436-85e4-fb963d57a385","Type":"ContainerStarted","Data":"acd5c1c29f7985c9b10071fab444c30a6661ee0064ece7bdbf75e4fa106e6cb0"} Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.714756 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-cslt7" Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.724292 
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.724292 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-tssnp" podStartSLOduration=3.920272044 podStartE2EDuration="46.724276737s" podCreationTimestamp="2025-10-01 13:08:50 +0000 UTC" firstStartedPulling="2025-10-01 13:08:52.560907447 +0000 UTC m=+940.906024933" lastFinishedPulling="2025-10-01 13:09:35.36491214 +0000 UTC m=+983.710029626" observedRunningTime="2025-10-01 13:09:36.721372865 +0000 UTC m=+985.066490361" watchObservedRunningTime="2025-10-01 13:09:36.724276737 +0000 UTC m=+985.069394223"
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.751945 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mkxn2" podStartSLOduration=4.779975198 podStartE2EDuration="46.751931032s" podCreationTimestamp="2025-10-01 13:08:50 +0000 UTC" firstStartedPulling="2025-10-01 13:08:52.784304782 +0000 UTC m=+941.129422258" lastFinishedPulling="2025-10-01 13:09:34.756260606 +0000 UTC m=+983.101378092" observedRunningTime="2025-10-01 13:09:36.751788658 +0000 UTC m=+985.096906154" watchObservedRunningTime="2025-10-01 13:09:36.751931032 +0000 UTC m=+985.097048518"
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.758334 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-22n8s" event={"ID":"5d973d57-3aa0-4d14-9c4a-435f6ff880af","Type":"ContainerStarted","Data":"01fc80f44dd0bedef60f6147cb4bf9d6424c99976801b7b119470044a526c25f"}
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.780981 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62xmc" event={"ID":"9389dca9-fd43-4f83-a3a8-b755859b252e","Type":"ContainerStarted","Data":"df6594e3d81d9fab772fcd32b1600fa034a9cb19bf4136a71f40b61d0423ca23"}
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.815801 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-ms6rc" event={"ID":"08cf0ffd-5ec4-4406-a912-319a1c9ced15","Type":"ContainerStarted","Data":"e3b8f84794639a62e1e95d8141927d0028cf41fa5828f09d0a7d5094c7627a52"}
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.816146 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-ms6rc"
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.825467 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-cslt7" podStartSLOduration=10.871114045 podStartE2EDuration="46.82544955s" podCreationTimestamp="2025-10-01 13:08:50 +0000 UTC" firstStartedPulling="2025-10-01 13:08:52.665134327 +0000 UTC m=+941.010251813" lastFinishedPulling="2025-10-01 13:09:28.619469802 +0000 UTC m=+976.964587318" observedRunningTime="2025-10-01 13:09:36.824782031 +0000 UTC m=+985.169899517" watchObservedRunningTime="2025-10-01 13:09:36.82544955 +0000 UTC m=+985.170567036"
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.825860 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-85777745bb-62wb7" podStartSLOduration=3.750029007 podStartE2EDuration="45.825855902s" podCreationTimestamp="2025-10-01 13:08:51 +0000 UTC" firstStartedPulling="2025-10-01 13:08:52.76700884 +0000 UTC m=+941.112126316" lastFinishedPulling="2025-10-01 13:09:34.842835715 +0000 UTC m=+983.187953211" observedRunningTime="2025-10-01 13:09:36.784846537 +0000 UTC m=+985.129964023" watchObservedRunningTime="2025-10-01 13:09:36.825855902 +0000 UTC m=+985.170973388"
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.848981 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-dq52c" event={"ID":"ac9e1ccb-a68d-446e-b47b-de00d828f332","Type":"ContainerStarted","Data":"9b9b15d41a33124f759f705a5076b5d0babe81c1d0e0240c49e6278919f8075e"}
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.849076 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-ms6rc" podStartSLOduration=10.261420331 podStartE2EDuration="46.849057831s" podCreationTimestamp="2025-10-01 13:08:50 +0000 UTC" firstStartedPulling="2025-10-01 13:08:52.035718012 +0000 UTC m=+940.380835498" lastFinishedPulling="2025-10-01 13:09:28.623355472 +0000 UTC m=+976.968472998" observedRunningTime="2025-10-01 13:09:36.847748494 +0000 UTC m=+985.192865980" watchObservedRunningTime="2025-10-01 13:09:36.849057831 +0000 UTC m=+985.194175317"
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.849416 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-dq52c"
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.851609 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8ctmstz" event={"ID":"9ac4cdbb-6f06-4525-8f5a-61f81a230708","Type":"ContainerStarted","Data":"7f54a833ba67e3964aae1f430fc5808a3fadb2a16cf0fdce9a12dc0b165fab2b"}
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.852672 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8ctmstz"
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.859402 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-knmsq" event={"ID":"4dcb48dd-5baf-415e-861e-ebfa40fc2e84","Type":"ContainerStarted","Data":"27a79d91f269dfe4fc73da5366ba9be11138fff4fc918dbd26adccff9878cde5"}
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.859438 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-knmsq" event={"ID":"4dcb48dd-5baf-415e-861e-ebfa40fc2e84","Type":"ContainerStarted","Data":"c6cd3faefa5230002fb86de9031461acc5a30f3ec8dbb444e97cf3a82803dd30"}
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.859722 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-knmsq"
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.867060 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-8c9zh" event={"ID":"a1e24a95-87b4-4f06-b651-9c8c26a7021d","Type":"ContainerStarted","Data":"e234ca17af326e0e91b729a2c0fb94d875d49bc457d0f6e7fd88fad1a2933236"}
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.867095 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-8c9zh" event={"ID":"a1e24a95-87b4-4f06-b651-9c8c26a7021d","Type":"ContainerStarted","Data":"fdc18a3ff17bc7181980dddbbc321744a00af3d2eb309747af1df961c1acd515"}
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.867665 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-8c9zh"
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.868777 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-flp6z" event={"ID":"a7cf8be8-c21f-4904-838e-f185857ef960","Type":"ContainerStarted","Data":"31549c12203facd919937c41d781fb91a4ea931c21ed6f401a37f4e74f044a91"}
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.869131 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-flp6z"
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.870133 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-zzts5" event={"ID":"c9b7da57-564a-4053-a476-08db0d87c317","Type":"ContainerStarted","Data":"9737aee3315d96dec6c8ba6f717de8364350412c11070f3ddfc24330a6aa6838"}
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.870476 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-zzts5"
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.887177 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-qjqpv" event={"ID":"de177c71-0c78-416b-a62f-2d73d86a2b70","Type":"ContainerStarted","Data":"10c71955a6cbcb59d63f82a2239198db77a601aa828cc9bc498cd1870d30c912"}
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.887212 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-qjqpv" event={"ID":"de177c71-0c78-416b-a62f-2d73d86a2b70","Type":"ContainerStarted","Data":"1712be23a7cc346d03ff42244fd95c7389b9fec5736805edfb5c7e52edcc115d"}
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.887758 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-88c7-qjqpv"
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.914045 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-nxjn7" event={"ID":"613abfd6-27d5-4f52-bad5-024d71335465","Type":"ContainerStarted","Data":"d74a481d1083c73d994b858a23255b9239b9f672fbcb0c63ffd18bca4b9f4b5f"}
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.914592 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-nxjn7"
Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.934564 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-r7gqw" event={"ID":"ba805aa7-4c6e-4dc1-8de8-c935ab1c2128","Type":"ContainerStarted","Data":"ef99f6d0b1e850d17576d0844534807ca5fbfea90202d3fa5f8b0501ba1e5d8f"}
event={"ID":"ba805aa7-4c6e-4dc1-8de8-c935ab1c2128","Type":"ContainerStarted","Data":"ff4bf8dcd16088ca57726adf9682014e2410bb1f858db0f61c4f5eda9d995251"} Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.935778 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-r7gqw" Oct 01 13:09:36 crc kubenswrapper[4851]: I1001 13:09:36.947686 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-8c9zh" podStartSLOduration=10.328848745 podStartE2EDuration="46.947664681s" podCreationTimestamp="2025-10-01 13:08:50 +0000 UTC" firstStartedPulling="2025-10-01 13:08:52.000655466 +0000 UTC m=+940.345772952" lastFinishedPulling="2025-10-01 13:09:28.619471372 +0000 UTC m=+976.964588888" observedRunningTime="2025-10-01 13:09:36.942920617 +0000 UTC m=+985.288038113" watchObservedRunningTime="2025-10-01 13:09:36.947664681 +0000 UTC m=+985.292782167" Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.033675 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-flp6z" podStartSLOduration=11.800889992 podStartE2EDuration="47.033659504s" podCreationTimestamp="2025-10-01 13:08:50 +0000 UTC" firstStartedPulling="2025-10-01 13:08:52.52966679 +0000 UTC m=+940.874784276" lastFinishedPulling="2025-10-01 13:09:27.762436292 +0000 UTC m=+976.107553788" observedRunningTime="2025-10-01 13:09:36.967023271 +0000 UTC m=+985.312140757" watchObservedRunningTime="2025-10-01 13:09:37.033659504 +0000 UTC m=+985.378776990" Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.038548 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8ctmstz" podStartSLOduration=5.170962742 podStartE2EDuration="47.038538682s" podCreationTimestamp="2025-10-01 13:08:50 +0000 UTC" firstStartedPulling="2025-10-01 13:08:52.903291841 +0000 UTC m=+941.248409327" lastFinishedPulling="2025-10-01 13:09:34.770867781 +0000 UTC m=+983.115985267" observedRunningTime="2025-10-01 13:09:37.032078419 +0000 UTC m=+985.377195905" watchObservedRunningTime="2025-10-01 13:09:37.038538682 +0000 UTC m=+985.383656168" Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.065243 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-zzts5" podStartSLOduration=3.944417949 podStartE2EDuration="46.06522783s" podCreationTimestamp="2025-10-01 13:08:51 +0000 UTC" firstStartedPulling="2025-10-01 13:08:52.765619291 +0000 UTC m=+941.110736787" lastFinishedPulling="2025-10-01 13:09:34.886429182 +0000 UTC m=+983.231546668" observedRunningTime="2025-10-01 13:09:37.063782899 +0000 UTC m=+985.408900385" watchObservedRunningTime="2025-10-01 13:09:37.06522783 +0000 UTC m=+985.410345316" Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.089022 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-88c7-qjqpv" podStartSLOduration=12.507281543 podStartE2EDuration="47.089004866s" podCreationTimestamp="2025-10-01 13:08:50 +0000 UTC" firstStartedPulling="2025-10-01 13:08:52.67507695 +0000 UTC m=+941.020194436" lastFinishedPulling="2025-10-01 13:09:27.256800233 +0000 UTC m=+975.601917759" observedRunningTime="2025-10-01 13:09:37.085033383 +0000 UTC 
m=+985.430150869" watchObservedRunningTime="2025-10-01 13:09:37.089004866 +0000 UTC m=+985.434122352" Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.121832 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-dq52c" podStartSLOduration=10.987209353 podStartE2EDuration="47.121809297s" podCreationTimestamp="2025-10-01 13:08:50 +0000 UTC" firstStartedPulling="2025-10-01 13:08:52.507084209 +0000 UTC m=+940.852201695" lastFinishedPulling="2025-10-01 13:09:28.641684113 +0000 UTC m=+976.986801639" observedRunningTime="2025-10-01 13:09:37.117806674 +0000 UTC m=+985.462924160" watchObservedRunningTime="2025-10-01 13:09:37.121809297 +0000 UTC m=+985.466926783" Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.155020 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-knmsq" podStartSLOduration=10.836192434 podStartE2EDuration="47.15500374s" podCreationTimestamp="2025-10-01 13:08:50 +0000 UTC" firstStartedPulling="2025-10-01 13:08:52.32298164 +0000 UTC m=+940.668099126" lastFinishedPulling="2025-10-01 13:09:28.641792916 +0000 UTC m=+976.986910432" observedRunningTime="2025-10-01 13:09:37.147878218 +0000 UTC m=+985.492995704" watchObservedRunningTime="2025-10-01 13:09:37.15500374 +0000 UTC m=+985.500121226" Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.175011 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-nxjn7" podStartSLOduration=11.639015554 podStartE2EDuration="47.174997338s" podCreationTimestamp="2025-10-01 13:08:50 +0000 UTC" firstStartedPulling="2025-10-01 13:08:51.720948893 +0000 UTC m=+940.066066379" lastFinishedPulling="2025-10-01 13:09:27.256930677 +0000 UTC m=+975.602048163" observedRunningTime="2025-10-01 13:09:37.169118601 +0000 UTC m=+985.514236087" watchObservedRunningTime="2025-10-01 13:09:37.174997338 +0000 UTC m=+985.520114824" Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.943647 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-cslt7" event={"ID":"d7ddf969-982c-4436-85e4-fb963d57a385","Type":"ContainerStarted","Data":"f12fa67a9203953c60c641ccbb3519991f02a7f901647a97569511647dfdc1b2"} Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.945894 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-nxjn7" event={"ID":"613abfd6-27d5-4f52-bad5-024d71335465","Type":"ContainerStarted","Data":"3cc09d611376e2dd8d3557786f7ce4c3f9072b7a99ba6d8d6ceec54390ad9557"} Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.947655 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-flp6z" event={"ID":"a7cf8be8-c21f-4904-838e-f185857ef960","Type":"ContainerStarted","Data":"f06b9f2e936d0e65d2d281928e4fa87f94de2819de1545c491c2fe34b002d680"} Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.949785 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-22n8s" event={"ID":"5d973d57-3aa0-4d14-9c4a-435f6ff880af","Type":"ContainerStarted","Data":"89e3f7dbd7694823165664f89aab16886f19793a6029b012779ccbb985371be1"} Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.949923 4851 
Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.949923 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-22n8s"
Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.951732 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62xmc" event={"ID":"9389dca9-fd43-4f83-a3a8-b755859b252e","Type":"ContainerStarted","Data":"72022e7d679e0bb0c156ba763a42e9f7cb52853948ab519adc8da2757d906a00"}
Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.952252 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62xmc"
Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.956466 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-qxlj6" event={"ID":"ee6067ef-427a-49d8-99b6-694930e44a0d","Type":"ContainerStarted","Data":"7acdb9879bf8c7b9e91282298895af562526cdb5955017202284f268c322721b"}
Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.957225 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-qxlj6"
Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.959400 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-dq52c" event={"ID":"ac9e1ccb-a68d-446e-b47b-de00d828f332","Type":"ContainerStarted","Data":"d8ac79cd1cbca733d77cfc1e91094f4ca165721e30316f202d257be917b338f7"}
Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.963784 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-zx88h" event={"ID":"324c39a8-85b5-4caf-a719-a5f47a827d08","Type":"ContainerStarted","Data":"4e2f486eb2cb7a5fe76c866e3aa21b9d58b3899355cd856f91b829265f30f019"}
Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.964230 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-zx88h"
Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.967158 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-r7gqw" podStartSLOduration=11.663053557 podStartE2EDuration="47.967145935s" podCreationTimestamp="2025-10-01 13:08:50 +0000 UTC" firstStartedPulling="2025-10-01 13:08:52.327958331 +0000 UTC m=+940.673075817" lastFinishedPulling="2025-10-01 13:09:28.632050669 +0000 UTC m=+976.977168195" observedRunningTime="2025-10-01 13:09:37.210025472 +0000 UTC m=+985.555142958" watchObservedRunningTime="2025-10-01 13:09:37.967145935 +0000 UTC m=+986.312263421"
Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.986021 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62xmc" podStartSLOduration=2.522046415 podStartE2EDuration="46.986004061s" podCreationTimestamp="2025-10-01 13:08:51 +0000 UTC" firstStartedPulling="2025-10-01 13:08:52.765609811 +0000 UTC m=+941.110727297" lastFinishedPulling="2025-10-01 13:09:37.229567467 +0000 UTC m=+985.574684943" observedRunningTime="2025-10-01 13:09:37.985764064 +0000 UTC m=+986.330881550" watchObservedRunningTime="2025-10-01 13:09:37.986004061 +0000 UTC m=+986.331121547"
Oct 01 13:09:37 crc kubenswrapper[4851]: I1001 13:09:37.991894 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-22n8s" podStartSLOduration=3.031266986 podStartE2EDuration="47.991875098s" podCreationTimestamp="2025-10-01 13:08:50 +0000 UTC" firstStartedPulling="2025-10-01 13:08:52.689662494 +0000 UTC m=+941.034779980" lastFinishedPulling="2025-10-01 13:09:37.650270606 +0000 UTC m=+985.995388092" observedRunningTime="2025-10-01 13:09:37.967331951 +0000 UTC m=+986.312449437" watchObservedRunningTime="2025-10-01 13:09:37.991875098 +0000 UTC m=+986.336992584"
Oct 01 13:09:38 crc kubenswrapper[4851]: I1001 13:09:38.019314 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-zx88h" podStartSLOduration=2.8408479079999998 podStartE2EDuration="48.019288566s" podCreationTimestamp="2025-10-01 13:08:50 +0000 UTC" firstStartedPulling="2025-10-01 13:08:52.416879397 +0000 UTC m=+940.761996883" lastFinishedPulling="2025-10-01 13:09:37.595320055 +0000 UTC m=+985.940437541" observedRunningTime="2025-10-01 13:09:38.014715736 +0000 UTC m=+986.359833222" watchObservedRunningTime="2025-10-01 13:09:38.019288566 +0000 UTC m=+986.364406052"
Oct 01 13:09:38 crc kubenswrapper[4851]: I1001 13:09:38.033726 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-qxlj6" podStartSLOduration=4.71801597 podStartE2EDuration="48.033709326s" podCreationTimestamp="2025-10-01 13:08:50 +0000 UTC" firstStartedPulling="2025-10-01 13:08:52.800997516 +0000 UTC m=+941.146115002" lastFinishedPulling="2025-10-01 13:09:36.116690882 +0000 UTC m=+984.461808358" observedRunningTime="2025-10-01 13:09:38.032005608 +0000 UTC m=+986.377123084" watchObservedRunningTime="2025-10-01 13:09:38.033709326 +0000 UTC m=+986.378826812"
Oct 01 13:09:41 crc kubenswrapper[4851]: I1001 13:09:41.047215 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-nxjn7"
Oct 01 13:09:41 crc kubenswrapper[4851]: I1001 13:09:41.063297 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-ms6rc"
Oct 01 13:09:41 crc kubenswrapper[4851]: I1001 13:09:41.097675 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-knmsq"
Oct 01 13:09:41 crc kubenswrapper[4851]: I1001 13:09:41.121642 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-8c9zh"
Oct 01 13:09:41 crc kubenswrapper[4851]: I1001 13:09:41.181818 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-r7gqw"
Oct 01 13:09:41 crc kubenswrapper[4851]: I1001 13:09:41.220926 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-tssnp"
Oct 01 13:09:41 crc kubenswrapper[4851]: I1001 13:09:41.335582 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-dq52c"
Oct 01 13:09:41 crc kubenswrapper[4851]: I1001 13:09:41.362405 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-flp6z"
Oct 01 13:09:41 crc kubenswrapper[4851]: I1001 13:09:41.472645 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-88c7-qjqpv"
Oct 01 13:09:41 crc kubenswrapper[4851]: I1001 13:09:41.516692 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mkxn2"
Oct 01 13:09:41 crc kubenswrapper[4851]: I1001 13:09:41.521376 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-zzts5"
Oct 01 13:09:41 crc kubenswrapper[4851]: I1001 13:09:41.543078 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-qxlj6"
Oct 01 13:09:41 crc kubenswrapper[4851]: I1001 13:09:41.577895 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-c8r2z"
Oct 01 13:09:41 crc kubenswrapper[4851]: I1001 13:09:41.636107 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-cslt7"
Oct 01 13:09:41 crc kubenswrapper[4851]: I1001 13:09:41.878823 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-85777745bb-62wb7"
Oct 01 13:09:41 crc kubenswrapper[4851]: I1001 13:09:41.901406 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-d64f8f9f6-2qx7f"
Oct 01 13:09:41 crc kubenswrapper[4851]: I1001 13:09:41.916011 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8ctmstz"
Oct 01 13:09:41 crc kubenswrapper[4851]: I1001 13:09:41.921695 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-4chzf"
Oct 01 13:09:51 crc kubenswrapper[4851]: I1001 13:09:51.460850 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-zx88h"
Oct 01 13:09:51 crc kubenswrapper[4851]: I1001 13:09:51.492311 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-22n8s"
Oct 01 13:09:51 crc kubenswrapper[4851]: I1001 13:09:51.501023 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62xmc"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.301003 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dd886f455-5sn9b"]
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.304100 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dd886f455-5sn9b"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.305759 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-fvwjx"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.306318 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.306584 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.307073 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.312123 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dd886f455-5sn9b"]
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.352335 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/035372ef-7ca6-4afa-bcd7-1e56f5a79a0f-config\") pod \"dnsmasq-dns-6dd886f455-5sn9b\" (UID: \"035372ef-7ca6-4afa-bcd7-1e56f5a79a0f\") " pod="openstack/dnsmasq-dns-6dd886f455-5sn9b"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.352390 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4z2n\" (UniqueName: \"kubernetes.io/projected/035372ef-7ca6-4afa-bcd7-1e56f5a79a0f-kube-api-access-w4z2n\") pod \"dnsmasq-dns-6dd886f455-5sn9b\" (UID: \"035372ef-7ca6-4afa-bcd7-1e56f5a79a0f\") " pod="openstack/dnsmasq-dns-6dd886f455-5sn9b"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.401423 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77cbcfdc8f-v52m6"]
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.403223 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77cbcfdc8f-v52m6"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.405123 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.425297 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77cbcfdc8f-v52m6"]
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.454866 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0db2d45b-2535-4e9b-8109-b3c8bde34391-dns-svc\") pod \"dnsmasq-dns-77cbcfdc8f-v52m6\" (UID: \"0db2d45b-2535-4e9b-8109-b3c8bde34391\") " pod="openstack/dnsmasq-dns-77cbcfdc8f-v52m6"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.454910 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0db2d45b-2535-4e9b-8109-b3c8bde34391-config\") pod \"dnsmasq-dns-77cbcfdc8f-v52m6\" (UID: \"0db2d45b-2535-4e9b-8109-b3c8bde34391\") " pod="openstack/dnsmasq-dns-77cbcfdc8f-v52m6"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.454956 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/035372ef-7ca6-4afa-bcd7-1e56f5a79a0f-config\") pod \"dnsmasq-dns-6dd886f455-5sn9b\" (UID: \"035372ef-7ca6-4afa-bcd7-1e56f5a79a0f\") " pod="openstack/dnsmasq-dns-6dd886f455-5sn9b"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.455001 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4z2n\" (UniqueName: \"kubernetes.io/projected/035372ef-7ca6-4afa-bcd7-1e56f5a79a0f-kube-api-access-w4z2n\") pod \"dnsmasq-dns-6dd886f455-5sn9b\" (UID: \"035372ef-7ca6-4afa-bcd7-1e56f5a79a0f\") " pod="openstack/dnsmasq-dns-6dd886f455-5sn9b"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.455020 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgkmh\" (UniqueName: \"kubernetes.io/projected/0db2d45b-2535-4e9b-8109-b3c8bde34391-kube-api-access-sgkmh\") pod \"dnsmasq-dns-77cbcfdc8f-v52m6\" (UID: \"0db2d45b-2535-4e9b-8109-b3c8bde34391\") " pod="openstack/dnsmasq-dns-77cbcfdc8f-v52m6"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.455872 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/035372ef-7ca6-4afa-bcd7-1e56f5a79a0f-config\") pod \"dnsmasq-dns-6dd886f455-5sn9b\" (UID: \"035372ef-7ca6-4afa-bcd7-1e56f5a79a0f\") " pod="openstack/dnsmasq-dns-6dd886f455-5sn9b"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.472617 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4z2n\" (UniqueName: \"kubernetes.io/projected/035372ef-7ca6-4afa-bcd7-1e56f5a79a0f-kube-api-access-w4z2n\") pod \"dnsmasq-dns-6dd886f455-5sn9b\" (UID: \"035372ef-7ca6-4afa-bcd7-1e56f5a79a0f\") " pod="openstack/dnsmasq-dns-6dd886f455-5sn9b"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.556177 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgkmh\" (UniqueName: \"kubernetes.io/projected/0db2d45b-2535-4e9b-8109-b3c8bde34391-kube-api-access-sgkmh\") pod \"dnsmasq-dns-77cbcfdc8f-v52m6\" (UID: \"0db2d45b-2535-4e9b-8109-b3c8bde34391\") " pod="openstack/dnsmasq-dns-77cbcfdc8f-v52m6"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.556288 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0db2d45b-2535-4e9b-8109-b3c8bde34391-dns-svc\") pod \"dnsmasq-dns-77cbcfdc8f-v52m6\" (UID: \"0db2d45b-2535-4e9b-8109-b3c8bde34391\") " pod="openstack/dnsmasq-dns-77cbcfdc8f-v52m6"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.556312 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0db2d45b-2535-4e9b-8109-b3c8bde34391-config\") pod \"dnsmasq-dns-77cbcfdc8f-v52m6\" (UID: \"0db2d45b-2535-4e9b-8109-b3c8bde34391\") " pod="openstack/dnsmasq-dns-77cbcfdc8f-v52m6"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.557151 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0db2d45b-2535-4e9b-8109-b3c8bde34391-config\") pod \"dnsmasq-dns-77cbcfdc8f-v52m6\" (UID: \"0db2d45b-2535-4e9b-8109-b3c8bde34391\") " pod="openstack/dnsmasq-dns-77cbcfdc8f-v52m6"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.557294 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0db2d45b-2535-4e9b-8109-b3c8bde34391-dns-svc\") pod \"dnsmasq-dns-77cbcfdc8f-v52m6\" (UID: \"0db2d45b-2535-4e9b-8109-b3c8bde34391\") " pod="openstack/dnsmasq-dns-77cbcfdc8f-v52m6"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.575993 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgkmh\" (UniqueName: \"kubernetes.io/projected/0db2d45b-2535-4e9b-8109-b3c8bde34391-kube-api-access-sgkmh\") pod \"dnsmasq-dns-77cbcfdc8f-v52m6\" (UID: \"0db2d45b-2535-4e9b-8109-b3c8bde34391\") " pod="openstack/dnsmasq-dns-77cbcfdc8f-v52m6"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.631259 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dd886f455-5sn9b"
Oct 01 13:10:10 crc kubenswrapper[4851]: I1001 13:10:10.719620 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77cbcfdc8f-v52m6"
Oct 01 13:10:11 crc kubenswrapper[4851]: I1001 13:10:11.086382 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dd886f455-5sn9b"]
Oct 01 13:10:11 crc kubenswrapper[4851]: I1001 13:10:11.146447 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77cbcfdc8f-v52m6"]
Oct 01 13:10:11 crc kubenswrapper[4851]: W1001 13:10:11.153446 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db2d45b_2535_4e9b_8109_b3c8bde34391.slice/crio-da146c97d540f5be48ad583ca6d38b2ebf3622581970145c0e54c2b89ea173e8 WatchSource:0}: Error finding container da146c97d540f5be48ad583ca6d38b2ebf3622581970145c0e54c2b89ea173e8: Status 404 returned error can't find the container with id da146c97d540f5be48ad583ca6d38b2ebf3622581970145c0e54c2b89ea173e8
Oct 01 13:10:11 crc kubenswrapper[4851]: I1001 13:10:11.260415 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77cbcfdc8f-v52m6" event={"ID":"0db2d45b-2535-4e9b-8109-b3c8bde34391","Type":"ContainerStarted","Data":"da146c97d540f5be48ad583ca6d38b2ebf3622581970145c0e54c2b89ea173e8"}
Oct 01 13:10:11 crc kubenswrapper[4851]: I1001 13:10:11.261836 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dd886f455-5sn9b" event={"ID":"035372ef-7ca6-4afa-bcd7-1e56f5a79a0f","Type":"ContainerStarted","Data":"ac55713d100fb14d5de4d10c3e6611f8096ce1458b44c04a704c625b2d196815"}
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.304140 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dd886f455-5sn9b"]
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.337321 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57945c477-2qxhb"]
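The dnsmasq-dns churn in this stretch (ADD 6dd886f455-5sn9b, ADD 77cbcfdc8f-v52m6, DELETE 5sn9b, ADD 57945c477-2qxhb, and so on) is consistent with a Deployment whose pod template is edited several times in quick succession: each edit stamps a new ReplicaSet hash into the pod name, and the previous pod is deleted as soon as the next hash supersedes it. A sketch that recovers the rollout order from the SyncLoop ADD/DELETE lines, under the same assumption as above (journal text on stdin):

    // Print dnsmasq-dns pod ADD/DELETE events in order; the next-to-last
    // segment of each pod name is the ReplicaSet's pod-template hash.
    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    var syncRe = regexp.MustCompile(
        `"SyncLoop (ADD|DELETE)" source="api" pods=\["openstack/(dnsmasq-dns-[^"]+)"\]`)

    func main() {
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
        for sc.Scan() {
            if m := syncRe.FindStringSubmatch(sc.Text()); m != nil {
                fmt.Printf("%-6s %s\n", m[1], m[2])
            }
        }
    }

Over this section it yields the hash sequence 6dd886f455, 77cbcfdc8f, 57945c477, 749b584b69, 56595f9cf7, each ADD followed shortly by the DELETE of its predecessor.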
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.338540 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57945c477-2qxhb"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.348568 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57945c477-2qxhb"]
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.431722 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsnzw\" (UniqueName: \"kubernetes.io/projected/513e329d-81d9-448b-be02-43bf41b95f09-kube-api-access-tsnzw\") pod \"dnsmasq-dns-57945c477-2qxhb\" (UID: \"513e329d-81d9-448b-be02-43bf41b95f09\") " pod="openstack/dnsmasq-dns-57945c477-2qxhb"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.431806 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/513e329d-81d9-448b-be02-43bf41b95f09-config\") pod \"dnsmasq-dns-57945c477-2qxhb\" (UID: \"513e329d-81d9-448b-be02-43bf41b95f09\") " pod="openstack/dnsmasq-dns-57945c477-2qxhb"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.431858 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/513e329d-81d9-448b-be02-43bf41b95f09-dns-svc\") pod \"dnsmasq-dns-57945c477-2qxhb\" (UID: \"513e329d-81d9-448b-be02-43bf41b95f09\") " pod="openstack/dnsmasq-dns-57945c477-2qxhb"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.533121 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/513e329d-81d9-448b-be02-43bf41b95f09-dns-svc\") pod \"dnsmasq-dns-57945c477-2qxhb\" (UID: \"513e329d-81d9-448b-be02-43bf41b95f09\") " pod="openstack/dnsmasq-dns-57945c477-2qxhb"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.533204 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsnzw\" (UniqueName: \"kubernetes.io/projected/513e329d-81d9-448b-be02-43bf41b95f09-kube-api-access-tsnzw\") pod \"dnsmasq-dns-57945c477-2qxhb\" (UID: \"513e329d-81d9-448b-be02-43bf41b95f09\") " pod="openstack/dnsmasq-dns-57945c477-2qxhb"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.533264 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/513e329d-81d9-448b-be02-43bf41b95f09-config\") pod \"dnsmasq-dns-57945c477-2qxhb\" (UID: \"513e329d-81d9-448b-be02-43bf41b95f09\") " pod="openstack/dnsmasq-dns-57945c477-2qxhb"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.534251 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/513e329d-81d9-448b-be02-43bf41b95f09-dns-svc\") pod \"dnsmasq-dns-57945c477-2qxhb\" (UID: \"513e329d-81d9-448b-be02-43bf41b95f09\") " pod="openstack/dnsmasq-dns-57945c477-2qxhb"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.535149 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/513e329d-81d9-448b-be02-43bf41b95f09-config\") pod \"dnsmasq-dns-57945c477-2qxhb\" (UID: \"513e329d-81d9-448b-be02-43bf41b95f09\") " pod="openstack/dnsmasq-dns-57945c477-2qxhb"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.567520 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsnzw\" (UniqueName: \"kubernetes.io/projected/513e329d-81d9-448b-be02-43bf41b95f09-kube-api-access-tsnzw\") pod \"dnsmasq-dns-57945c477-2qxhb\" (UID: \"513e329d-81d9-448b-be02-43bf41b95f09\") " pod="openstack/dnsmasq-dns-57945c477-2qxhb"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.620458 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77cbcfdc8f-v52m6"]
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.647199 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-749b584b69-tjnss"]
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.648346 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-749b584b69-tjnss"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.656044 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57945c477-2qxhb"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.662360 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-749b584b69-tjnss"]
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.735871 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31bf58ed-cbe7-48da-a24a-677a859e487f-dns-svc\") pod \"dnsmasq-dns-749b584b69-tjnss\" (UID: \"31bf58ed-cbe7-48da-a24a-677a859e487f\") " pod="openstack/dnsmasq-dns-749b584b69-tjnss"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.735941 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnkmg\" (UniqueName: \"kubernetes.io/projected/31bf58ed-cbe7-48da-a24a-677a859e487f-kube-api-access-bnkmg\") pod \"dnsmasq-dns-749b584b69-tjnss\" (UID: \"31bf58ed-cbe7-48da-a24a-677a859e487f\") " pod="openstack/dnsmasq-dns-749b584b69-tjnss"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.735967 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31bf58ed-cbe7-48da-a24a-677a859e487f-config\") pod \"dnsmasq-dns-749b584b69-tjnss\" (UID: \"31bf58ed-cbe7-48da-a24a-677a859e487f\") " pod="openstack/dnsmasq-dns-749b584b69-tjnss"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.836291 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnkmg\" (UniqueName: \"kubernetes.io/projected/31bf58ed-cbe7-48da-a24a-677a859e487f-kube-api-access-bnkmg\") pod \"dnsmasq-dns-749b584b69-tjnss\" (UID: \"31bf58ed-cbe7-48da-a24a-677a859e487f\") " pod="openstack/dnsmasq-dns-749b584b69-tjnss"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.836338 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31bf58ed-cbe7-48da-a24a-677a859e487f-config\") pod \"dnsmasq-dns-749b584b69-tjnss\" (UID: \"31bf58ed-cbe7-48da-a24a-677a859e487f\") " pod="openstack/dnsmasq-dns-749b584b69-tjnss"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.836404 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31bf58ed-cbe7-48da-a24a-677a859e487f-dns-svc\") pod \"dnsmasq-dns-749b584b69-tjnss\" (UID: \"31bf58ed-cbe7-48da-a24a-677a859e487f\") " pod="openstack/dnsmasq-dns-749b584b69-tjnss"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.837273 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31bf58ed-cbe7-48da-a24a-677a859e487f-dns-svc\") pod \"dnsmasq-dns-749b584b69-tjnss\" (UID: \"31bf58ed-cbe7-48da-a24a-677a859e487f\") " pod="openstack/dnsmasq-dns-749b584b69-tjnss"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.837301 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31bf58ed-cbe7-48da-a24a-677a859e487f-config\") pod \"dnsmasq-dns-749b584b69-tjnss\" (UID: \"31bf58ed-cbe7-48da-a24a-677a859e487f\") " pod="openstack/dnsmasq-dns-749b584b69-tjnss"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.854436 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnkmg\" (UniqueName: \"kubernetes.io/projected/31bf58ed-cbe7-48da-a24a-677a859e487f-kube-api-access-bnkmg\") pod \"dnsmasq-dns-749b584b69-tjnss\" (UID: \"31bf58ed-cbe7-48da-a24a-677a859e487f\") " pod="openstack/dnsmasq-dns-749b584b69-tjnss"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.912962 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57945c477-2qxhb"]
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.933159 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56595f9cf7-29fbx"]
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.934324 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56595f9cf7-29fbx"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.937130 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af2859ca-d690-49fb-8dd6-2b50441aa577-dns-svc\") pod \"dnsmasq-dns-56595f9cf7-29fbx\" (UID: \"af2859ca-d690-49fb-8dd6-2b50441aa577\") " pod="openstack/dnsmasq-dns-56595f9cf7-29fbx"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.937247 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n7vz\" (UniqueName: \"kubernetes.io/projected/af2859ca-d690-49fb-8dd6-2b50441aa577-kube-api-access-9n7vz\") pod \"dnsmasq-dns-56595f9cf7-29fbx\" (UID: \"af2859ca-d690-49fb-8dd6-2b50441aa577\") " pod="openstack/dnsmasq-dns-56595f9cf7-29fbx"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.937284 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af2859ca-d690-49fb-8dd6-2b50441aa577-config\") pod \"dnsmasq-dns-56595f9cf7-29fbx\" (UID: \"af2859ca-d690-49fb-8dd6-2b50441aa577\") " pod="openstack/dnsmasq-dns-56595f9cf7-29fbx"
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.950797 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56595f9cf7-29fbx"]
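Each (pod, volume) pair above walks the same reconciler sequence, one log line per phase: operationExecutor.VerifyControllerAttachedVolume (reconciler_common.go:245), then MountVolume started (reconciler_common.go:218), then MountVolume.SetUp succeeded (operation_generator.go:637); block-backed volumes such as the local PV further down add a MountVolume.MountDevice step, which this sketch ignores. The sketch follows those three phases per pair and flags any volume that never reaches SetUp; note that volume names inside the quoted messages carry escaped quotes (\"config\"), which the pattern has to match:

    // Track each (pod, volume) pair through the reconciler phases visible in
    // this log and report pairs that never reach MountVolume.SetUp.
    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    var phaseRe = regexp.MustCompile(
        `(VerifyControllerAttachedVolume started|MountVolume started|MountVolume\.SetUp succeeded)` +
            ` for volume \\"([^\\"]+)\\".*pod="([^"]+)"`)

    var rank = map[string]int{
        "VerifyControllerAttachedVolume started": 1,
        "MountVolume started":                    2,
        "MountVolume.SetUp succeeded":            3,
    }

    func main() {
        progress := map[string]int{} // "pod/volume" -> furthest phase seen
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
        for sc.Scan() {
            if m := phaseRe.FindStringSubmatch(sc.Text()); m != nil {
                key := m[3] + "/" + m[2]
                if r := rank[m[1]]; r > progress[key] {
                    progress[key] = r
                }
            }
        }
        for key, r := range progress {
            if r < 3 {
                fmt.Println("never reached SetUp:", key)
            }
        }
    }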
Oct 01 13:10:14 crc kubenswrapper[4851]: I1001 13:10:14.964578 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-749b584b69-tjnss"
Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.042914 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af2859ca-d690-49fb-8dd6-2b50441aa577-dns-svc\") pod \"dnsmasq-dns-56595f9cf7-29fbx\" (UID: \"af2859ca-d690-49fb-8dd6-2b50441aa577\") " pod="openstack/dnsmasq-dns-56595f9cf7-29fbx"
Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.042996 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n7vz\" (UniqueName: \"kubernetes.io/projected/af2859ca-d690-49fb-8dd6-2b50441aa577-kube-api-access-9n7vz\") pod \"dnsmasq-dns-56595f9cf7-29fbx\" (UID: \"af2859ca-d690-49fb-8dd6-2b50441aa577\") " pod="openstack/dnsmasq-dns-56595f9cf7-29fbx"
Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.043015 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af2859ca-d690-49fb-8dd6-2b50441aa577-config\") pod \"dnsmasq-dns-56595f9cf7-29fbx\" (UID: \"af2859ca-d690-49fb-8dd6-2b50441aa577\") " pod="openstack/dnsmasq-dns-56595f9cf7-29fbx"
Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.043761 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af2859ca-d690-49fb-8dd6-2b50441aa577-dns-svc\") pod \"dnsmasq-dns-56595f9cf7-29fbx\" (UID: \"af2859ca-d690-49fb-8dd6-2b50441aa577\") " pod="openstack/dnsmasq-dns-56595f9cf7-29fbx"
Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.043872 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af2859ca-d690-49fb-8dd6-2b50441aa577-config\") pod \"dnsmasq-dns-56595f9cf7-29fbx\" (UID: \"af2859ca-d690-49fb-8dd6-2b50441aa577\") " pod="openstack/dnsmasq-dns-56595f9cf7-29fbx"
Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.073535 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n7vz\" (UniqueName: \"kubernetes.io/projected/af2859ca-d690-49fb-8dd6-2b50441aa577-kube-api-access-9n7vz\") pod \"dnsmasq-dns-56595f9cf7-29fbx\" (UID: \"af2859ca-d690-49fb-8dd6-2b50441aa577\") " pod="openstack/dnsmasq-dns-56595f9cf7-29fbx"
Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.268230 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56595f9cf7-29fbx"
Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.465872 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.467322 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.469703 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.469881 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.470035 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-dqh5p"
Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.470257 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.470400 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.477598 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.477822 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.484888 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.656235 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.656299 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/95c53639-696f-4d10-a297-7173dd3b394f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.656330 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/95c53639-696f-4d10-a297-7173dd3b394f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.656371 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/95c53639-696f-4d10-a297-7173dd3b394f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.656400 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0"
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/95c53639-696f-4d10-a297-7173dd3b394f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.656999 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.657049 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.657083 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgmt5\" (UniqueName: \"kubernetes.io/projected/95c53639-696f-4d10-a297-7173dd3b394f-kube-api-access-mgmt5\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.657160 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/95c53639-696f-4d10-a297-7173dd3b394f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.657282 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.754671 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.756595 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.759024 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.759077 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.759097 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgmt5\" (UniqueName: \"kubernetes.io/projected/95c53639-696f-4d10-a297-7173dd3b394f-kube-api-access-mgmt5\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.759120 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/95c53639-696f-4d10-a297-7173dd3b394f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.759153 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.759184 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.759201 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/95c53639-696f-4d10-a297-7173dd3b394f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.759216 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/95c53639-696f-4d10-a297-7173dd3b394f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.759244 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/95c53639-696f-4d10-a297-7173dd3b394f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.759526 4851 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.759470 4851 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.759712 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/95c53639-696f-4d10-a297-7173dd3b394f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.760213 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/95c53639-696f-4d10-a297-7173dd3b394f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.760268 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/95c53639-696f-4d10-a297-7173dd3b394f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.761459 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.761516 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.764097 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/95c53639-696f-4d10-a297-7173dd3b394f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.774337 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.775139 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/95c53639-696f-4d10-a297-7173dd3b394f-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.775726 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/95c53639-696f-4d10-a297-7173dd3b394f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.776537 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.776910 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-config-data" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.777429 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-plugins-conf" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.781058 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-erlang-cookie" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.781148 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-default-user" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.785673 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-server-conf" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.786303 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-server-dockercfg-44bqn" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.788790 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-notifications-svc" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.798321 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgmt5\" (UniqueName: \"kubernetes.io/projected/95c53639-696f-4d10-a297-7173dd3b394f-kube-api-access-mgmt5\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.802507 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.821231 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.842450 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.867784 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.867852 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.867906 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwltf\" (UniqueName: \"kubernetes.io/projected/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-kube-api-access-lwltf\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.867937 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.867959 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.867992 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.868025 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.868053 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.868092 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.868136 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.868158 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.970287 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.970391 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.970443 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.970520 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.970575 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwltf\" (UniqueName: \"kubernetes.io/projected/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-kube-api-access-lwltf\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.970614 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.970638 4851 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.970673 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.970706 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.970738 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.970767 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.970926 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.971084 4851 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.971770 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.972069 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.972459 4851 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.974612 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.974650 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.975385 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.977332 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.979971 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.989414 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwltf\" (UniqueName: \"kubernetes.io/projected/8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb-kube-api-access-lwltf\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:15 crc kubenswrapper[4851]: I1001 13:10:15.994328 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.067752 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.071370 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.079645 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.079963 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.080087 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-rx8zc" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.080286 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.080410 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.080463 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.080647 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.085381 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.175257 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/712c3704-a775-4fac-81d6-9aa9cfdc48ef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.175388 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.175451 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/712c3704-a775-4fac-81d6-9aa9cfdc48ef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.175507 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.175528 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.175548 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.175568 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.175590 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/712c3704-a775-4fac-81d6-9aa9cfdc48ef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.175615 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9x7g\" (UniqueName: \"kubernetes.io/projected/712c3704-a775-4fac-81d6-9aa9cfdc48ef-kube-api-access-p9x7g\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.175643 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/712c3704-a775-4fac-81d6-9aa9cfdc48ef-config-data\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.175665 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/712c3704-a775-4fac-81d6-9aa9cfdc48ef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.181290 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.277620 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/712c3704-a775-4fac-81d6-9aa9cfdc48ef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.278676 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.278753 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/712c3704-a775-4fac-81d6-9aa9cfdc48ef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.278837 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.278860 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.278889 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.278908 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.278931 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/712c3704-a775-4fac-81d6-9aa9cfdc48ef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.279016 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9x7g\" (UniqueName: \"kubernetes.io/projected/712c3704-a775-4fac-81d6-9aa9cfdc48ef-kube-api-access-p9x7g\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.279041 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/712c3704-a775-4fac-81d6-9aa9cfdc48ef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.279078 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/712c3704-a775-4fac-81d6-9aa9cfdc48ef-config-data\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.279171 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.279425 4851 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.280109 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/712c3704-a775-4fac-81d6-9aa9cfdc48ef-config-data\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.280358 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.280447 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/712c3704-a775-4fac-81d6-9aa9cfdc48ef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.281181 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/712c3704-a775-4fac-81d6-9aa9cfdc48ef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.282740 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/712c3704-a775-4fac-81d6-9aa9cfdc48ef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.282829 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.284042 4851 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.288187 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/712c3704-a775-4fac-81d6-9aa9cfdc48ef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.296190 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9x7g\" (UniqueName: \"kubernetes.io/projected/712c3704-a775-4fac-81d6-9aa9cfdc48ef-kube-api-access-p9x7g\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.308789 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " pod="openstack/rabbitmq-server-0" Oct 01 13:10:16 crc kubenswrapper[4851]: I1001 13:10:16.404452 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.086350 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.088405 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.091268 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.092569 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.092897 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.093087 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.094105 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-x7vhn" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.102133 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.103358 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.132110 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.133748 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.137228 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-ngb7d" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.137390 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.137392 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.137743 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.170629 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.235224 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.235295 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.235339 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.235361 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311e8f50-9e4a-4a03-bc24-04b76d53a238-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.235385 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-kolla-config\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.235431 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.235448 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/311e8f50-9e4a-4a03-bc24-04b76d53a238-config-data-default\") pod 
\"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.235472 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-config-data-default\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.235705 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.235783 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/311e8f50-9e4a-4a03-bc24-04b76d53a238-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.235818 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/311e8f50-9e4a-4a03-bc24-04b76d53a238-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.235861 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/311e8f50-9e4a-4a03-bc24-04b76d53a238-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.235921 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-secrets\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.235961 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/311e8f50-9e4a-4a03-bc24-04b76d53a238-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.235993 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/311e8f50-9e4a-4a03-bc24-04b76d53a238-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.236070 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf7dj\" (UniqueName: 
\"kubernetes.io/projected/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-kube-api-access-kf7dj\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.236175 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.236216 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj4wv\" (UniqueName: \"kubernetes.io/projected/311e8f50-9e4a-4a03-bc24-04b76d53a238-kube-api-access-kj4wv\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.337363 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-secrets\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.337403 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/311e8f50-9e4a-4a03-bc24-04b76d53a238-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.337420 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/311e8f50-9e4a-4a03-bc24-04b76d53a238-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.337446 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf7dj\" (UniqueName: \"kubernetes.io/projected/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-kube-api-access-kf7dj\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.337479 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.337518 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj4wv\" (UniqueName: \"kubernetes.io/projected/311e8f50-9e4a-4a03-bc24-04b76d53a238-kube-api-access-kj4wv\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.337556 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.337587 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.337608 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.337665 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311e8f50-9e4a-4a03-bc24-04b76d53a238-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.337690 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-kolla-config\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.337721 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.337739 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/311e8f50-9e4a-4a03-bc24-04b76d53a238-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.337761 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-config-data-default\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.337789 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.337808 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/311e8f50-9e4a-4a03-bc24-04b76d53a238-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.337823 4851 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/311e8f50-9e4a-4a03-bc24-04b76d53a238-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.337838 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/311e8f50-9e4a-4a03-bc24-04b76d53a238-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.338379 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/311e8f50-9e4a-4a03-bc24-04b76d53a238-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.338594 4851 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.342740 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.342769 4851 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.343437 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/311e8f50-9e4a-4a03-bc24-04b76d53a238-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.343715 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/311e8f50-9e4a-4a03-bc24-04b76d53a238-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.344004 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311e8f50-9e4a-4a03-bc24-04b76d53a238-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.344291 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/311e8f50-9e4a-4a03-bc24-04b76d53a238-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.346463 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-config-data-default\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.346791 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.350306 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/311e8f50-9e4a-4a03-bc24-04b76d53a238-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.351396 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.354252 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/311e8f50-9e4a-4a03-bc24-04b76d53a238-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.354678 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-kolla-config\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.360403 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-secrets\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.362079 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf7dj\" (UniqueName: \"kubernetes.io/projected/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-kube-api-access-kf7dj\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.362296 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c2ac1cc-c49c-4966-a357-2d1ba04d5671-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.368062 4851 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj4wv\" (UniqueName: \"kubernetes.io/projected/311e8f50-9e4a-4a03-bc24-04b76d53a238-kube-api-access-kj4wv\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.368640 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"311e8f50-9e4a-4a03-bc24-04b76d53a238\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.372761 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"7c2ac1cc-c49c-4966-a357-2d1ba04d5671\") " pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.440771 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.450190 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.750479 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.751428 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.758707 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.758762 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.759325 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-nmbn2" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.766377 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.844481 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/705d247f-3afa-49f5-ba1d-ab991af3e399-kolla-config\") pod \"memcached-0\" (UID: \"705d247f-3afa-49f5-ba1d-ab991af3e399\") " pod="openstack/memcached-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.844535 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2jfq\" (UniqueName: \"kubernetes.io/projected/705d247f-3afa-49f5-ba1d-ab991af3e399-kube-api-access-m2jfq\") pod \"memcached-0\" (UID: \"705d247f-3afa-49f5-ba1d-ab991af3e399\") " pod="openstack/memcached-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.844562 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/705d247f-3afa-49f5-ba1d-ab991af3e399-config-data\") pod \"memcached-0\" (UID: \"705d247f-3afa-49f5-ba1d-ab991af3e399\") " pod="openstack/memcached-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.844593 4851 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/705d247f-3afa-49f5-ba1d-ab991af3e399-memcached-tls-certs\") pod \"memcached-0\" (UID: \"705d247f-3afa-49f5-ba1d-ab991af3e399\") " pod="openstack/memcached-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.844615 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705d247f-3afa-49f5-ba1d-ab991af3e399-combined-ca-bundle\") pod \"memcached-0\" (UID: \"705d247f-3afa-49f5-ba1d-ab991af3e399\") " pod="openstack/memcached-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.946114 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/705d247f-3afa-49f5-ba1d-ab991af3e399-config-data\") pod \"memcached-0\" (UID: \"705d247f-3afa-49f5-ba1d-ab991af3e399\") " pod="openstack/memcached-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.946170 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/705d247f-3afa-49f5-ba1d-ab991af3e399-memcached-tls-certs\") pod \"memcached-0\" (UID: \"705d247f-3afa-49f5-ba1d-ab991af3e399\") " pod="openstack/memcached-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.946199 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705d247f-3afa-49f5-ba1d-ab991af3e399-combined-ca-bundle\") pod \"memcached-0\" (UID: \"705d247f-3afa-49f5-ba1d-ab991af3e399\") " pod="openstack/memcached-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.946283 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/705d247f-3afa-49f5-ba1d-ab991af3e399-kolla-config\") pod \"memcached-0\" (UID: \"705d247f-3afa-49f5-ba1d-ab991af3e399\") " pod="openstack/memcached-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.946300 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2jfq\" (UniqueName: \"kubernetes.io/projected/705d247f-3afa-49f5-ba1d-ab991af3e399-kube-api-access-m2jfq\") pod \"memcached-0\" (UID: \"705d247f-3afa-49f5-ba1d-ab991af3e399\") " pod="openstack/memcached-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.947258 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/705d247f-3afa-49f5-ba1d-ab991af3e399-config-data\") pod \"memcached-0\" (UID: \"705d247f-3afa-49f5-ba1d-ab991af3e399\") " pod="openstack/memcached-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.948132 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/705d247f-3afa-49f5-ba1d-ab991af3e399-kolla-config\") pod \"memcached-0\" (UID: \"705d247f-3afa-49f5-ba1d-ab991af3e399\") " pod="openstack/memcached-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.955514 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/705d247f-3afa-49f5-ba1d-ab991af3e399-memcached-tls-certs\") pod \"memcached-0\" (UID: \"705d247f-3afa-49f5-ba1d-ab991af3e399\") " pod="openstack/memcached-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 
13:10:19.955980 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705d247f-3afa-49f5-ba1d-ab991af3e399-combined-ca-bundle\") pod \"memcached-0\" (UID: \"705d247f-3afa-49f5-ba1d-ab991af3e399\") " pod="openstack/memcached-0" Oct 01 13:10:19 crc kubenswrapper[4851]: I1001 13:10:19.964470 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2jfq\" (UniqueName: \"kubernetes.io/projected/705d247f-3afa-49f5-ba1d-ab991af3e399-kube-api-access-m2jfq\") pod \"memcached-0\" (UID: \"705d247f-3afa-49f5-ba1d-ab991af3e399\") " pod="openstack/memcached-0" Oct 01 13:10:20 crc kubenswrapper[4851]: I1001 13:10:20.078589 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 01 13:10:21 crc kubenswrapper[4851]: I1001 13:10:21.529052 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 13:10:21 crc kubenswrapper[4851]: I1001 13:10:21.530358 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 13:10:21 crc kubenswrapper[4851]: I1001 13:10:21.535910 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-zq9hb" Oct 01 13:10:21 crc kubenswrapper[4851]: I1001 13:10:21.542978 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 13:10:21 crc kubenswrapper[4851]: I1001 13:10:21.568268 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drpnd\" (UniqueName: \"kubernetes.io/projected/5dbbdd15-a094-4982-a2f1-370f00a1b004-kube-api-access-drpnd\") pod \"kube-state-metrics-0\" (UID: \"5dbbdd15-a094-4982-a2f1-370f00a1b004\") " pod="openstack/kube-state-metrics-0" Oct 01 13:10:21 crc kubenswrapper[4851]: I1001 13:10:21.669619 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drpnd\" (UniqueName: \"kubernetes.io/projected/5dbbdd15-a094-4982-a2f1-370f00a1b004-kube-api-access-drpnd\") pod \"kube-state-metrics-0\" (UID: \"5dbbdd15-a094-4982-a2f1-370f00a1b004\") " pod="openstack/kube-state-metrics-0" Oct 01 13:10:21 crc kubenswrapper[4851]: I1001 13:10:21.692469 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drpnd\" (UniqueName: \"kubernetes.io/projected/5dbbdd15-a094-4982-a2f1-370f00a1b004-kube-api-access-drpnd\") pod \"kube-state-metrics-0\" (UID: \"5dbbdd15-a094-4982-a2f1-370f00a1b004\") " pod="openstack/kube-state-metrics-0" Oct 01 13:10:21 crc kubenswrapper[4851]: I1001 13:10:21.852897 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 13:10:22 crc kubenswrapper[4851]: I1001 13:10:22.860161 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:10:22 crc kubenswrapper[4851]: I1001 13:10:22.866033 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:22 crc kubenswrapper[4851]: I1001 13:10:22.868320 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 01 13:10:22 crc kubenswrapper[4851]: I1001 13:10:22.868726 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 01 13:10:22 crc kubenswrapper[4851]: I1001 13:10:22.870017 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 01 13:10:22 crc kubenswrapper[4851]: I1001 13:10:22.872373 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-vmjkn" Oct 01 13:10:22 crc kubenswrapper[4851]: I1001 13:10:22.873380 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:10:22 crc kubenswrapper[4851]: I1001 13:10:22.873910 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 01 13:10:22 crc kubenswrapper[4851]: I1001 13:10:22.875964 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 01 13:10:22 crc kubenswrapper[4851]: I1001 13:10:22.994554 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h97sc\" (UniqueName: \"kubernetes.io/projected/486b3f7c-3593-4950-b470-2d0a2f037e2f-kube-api-access-h97sc\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:22 crc kubenswrapper[4851]: I1001 13:10:22.994883 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/486b3f7c-3593-4950-b470-2d0a2f037e2f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:22 crc kubenswrapper[4851]: I1001 13:10:22.994989 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/486b3f7c-3593-4950-b470-2d0a2f037e2f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:22 crc kubenswrapper[4851]: I1001 13:10:22.995096 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/486b3f7c-3593-4950-b470-2d0a2f037e2f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:22 crc kubenswrapper[4851]: I1001 13:10:22.995201 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/486b3f7c-3593-4950-b470-2d0a2f037e2f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:22 crc kubenswrapper[4851]: I1001 13:10:22.995314 4851 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:22 crc kubenswrapper[4851]: I1001 13:10:22.995419 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/486b3f7c-3593-4950-b470-2d0a2f037e2f-config\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:22 crc kubenswrapper[4851]: I1001 13:10:22.995533 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/486b3f7c-3593-4950-b470-2d0a2f037e2f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:23 crc kubenswrapper[4851]: I1001 13:10:23.096648 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/486b3f7c-3593-4950-b470-2d0a2f037e2f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:23 crc kubenswrapper[4851]: I1001 13:10:23.096706 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:23 crc kubenswrapper[4851]: I1001 13:10:23.096756 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/486b3f7c-3593-4950-b470-2d0a2f037e2f-config\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:23 crc kubenswrapper[4851]: I1001 13:10:23.096789 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/486b3f7c-3593-4950-b470-2d0a2f037e2f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:23 crc kubenswrapper[4851]: I1001 13:10:23.096810 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h97sc\" (UniqueName: \"kubernetes.io/projected/486b3f7c-3593-4950-b470-2d0a2f037e2f-kube-api-access-h97sc\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:23 crc kubenswrapper[4851]: I1001 13:10:23.096829 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/486b3f7c-3593-4950-b470-2d0a2f037e2f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " 
pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:23 crc kubenswrapper[4851]: I1001 13:10:23.096861 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/486b3f7c-3593-4950-b470-2d0a2f037e2f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:23 crc kubenswrapper[4851]: I1001 13:10:23.096876 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/486b3f7c-3593-4950-b470-2d0a2f037e2f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:23 crc kubenswrapper[4851]: I1001 13:10:23.101121 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/486b3f7c-3593-4950-b470-2d0a2f037e2f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:23 crc kubenswrapper[4851]: I1001 13:10:23.101578 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/486b3f7c-3593-4950-b470-2d0a2f037e2f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:23 crc kubenswrapper[4851]: I1001 13:10:23.104594 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/486b3f7c-3593-4950-b470-2d0a2f037e2f-config\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:23 crc kubenswrapper[4851]: I1001 13:10:23.104615 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/486b3f7c-3593-4950-b470-2d0a2f037e2f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:23 crc kubenswrapper[4851]: I1001 13:10:23.107638 4851 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 01 13:10:23 crc kubenswrapper[4851]: I1001 13:10:23.107678 4851 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f6a0e30b7a50d8862a9481345085f62592a4f2276fdfe80014a12770adb24140/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:23 crc kubenswrapper[4851]: I1001 13:10:23.108828 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/486b3f7c-3593-4950-b470-2d0a2f037e2f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:23 crc kubenswrapper[4851]: I1001 13:10:23.118660 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h97sc\" (UniqueName: \"kubernetes.io/projected/486b3f7c-3593-4950-b470-2d0a2f037e2f-kube-api-access-h97sc\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:23 crc kubenswrapper[4851]: I1001 13:10:23.126009 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/486b3f7c-3593-4950-b470-2d0a2f037e2f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:23 crc kubenswrapper[4851]: I1001 13:10:23.149428 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\") pod \"prometheus-metric-storage-0\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:23 crc kubenswrapper[4851]: I1001 13:10:23.217780 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.759485 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dmq5k"] Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.761002 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.764171 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-s2fq8" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.764391 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.764773 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.772190 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dmq5k"] Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.783883 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-gt7fw"] Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.785621 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.821887 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-gt7fw"] Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.825164 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22-var-log\") pod \"ovn-controller-ovs-gt7fw\" (UID: \"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22\") " pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.825222 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22-scripts\") pod \"ovn-controller-ovs-gt7fw\" (UID: \"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22\") " pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.825243 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b15e5b7-162e-46ee-a292-bf763704cda6-var-run-ovn\") pod \"ovn-controller-dmq5k\" (UID: \"7b15e5b7-162e-46ee-a292-bf763704cda6\") " pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.825290 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7b15e5b7-162e-46ee-a292-bf763704cda6-var-run\") pod \"ovn-controller-dmq5k\" (UID: \"7b15e5b7-162e-46ee-a292-bf763704cda6\") " pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.825325 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgxl6\" (UniqueName: \"kubernetes.io/projected/7b15e5b7-162e-46ee-a292-bf763704cda6-kube-api-access-dgxl6\") pod \"ovn-controller-dmq5k\" (UID: \"7b15e5b7-162e-46ee-a292-bf763704cda6\") " pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.825368 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b15e5b7-162e-46ee-a292-bf763704cda6-combined-ca-bundle\") pod \"ovn-controller-dmq5k\" (UID: \"7b15e5b7-162e-46ee-a292-bf763704cda6\") " 
pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.825389 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b15e5b7-162e-46ee-a292-bf763704cda6-scripts\") pod \"ovn-controller-dmq5k\" (UID: \"7b15e5b7-162e-46ee-a292-bf763704cda6\") " pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.825455 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22-etc-ovs\") pod \"ovn-controller-ovs-gt7fw\" (UID: \"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22\") " pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.825643 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7b15e5b7-162e-46ee-a292-bf763704cda6-var-log-ovn\") pod \"ovn-controller-dmq5k\" (UID: \"7b15e5b7-162e-46ee-a292-bf763704cda6\") " pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.825697 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22-var-run\") pod \"ovn-controller-ovs-gt7fw\" (UID: \"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22\") " pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.825775 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22-var-lib\") pod \"ovn-controller-ovs-gt7fw\" (UID: \"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22\") " pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.825813 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsk6b\" (UniqueName: \"kubernetes.io/projected/928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22-kube-api-access-nsk6b\") pod \"ovn-controller-ovs-gt7fw\" (UID: \"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22\") " pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.825917 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b15e5b7-162e-46ee-a292-bf763704cda6-ovn-controller-tls-certs\") pod \"ovn-controller-dmq5k\" (UID: \"7b15e5b7-162e-46ee-a292-bf763704cda6\") " pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.927541 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7b15e5b7-162e-46ee-a292-bf763704cda6-var-log-ovn\") pod \"ovn-controller-dmq5k\" (UID: \"7b15e5b7-162e-46ee-a292-bf763704cda6\") " pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.927592 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22-var-run\") pod \"ovn-controller-ovs-gt7fw\" (UID: \"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22\") " pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:10:24 crc 
kubenswrapper[4851]: I1001 13:10:24.927624 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22-var-lib\") pod \"ovn-controller-ovs-gt7fw\" (UID: \"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22\") " pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.927649 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsk6b\" (UniqueName: \"kubernetes.io/projected/928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22-kube-api-access-nsk6b\") pod \"ovn-controller-ovs-gt7fw\" (UID: \"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22\") " pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.927697 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b15e5b7-162e-46ee-a292-bf763704cda6-ovn-controller-tls-certs\") pod \"ovn-controller-dmq5k\" (UID: \"7b15e5b7-162e-46ee-a292-bf763704cda6\") " pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.927730 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22-var-log\") pod \"ovn-controller-ovs-gt7fw\" (UID: \"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22\") " pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.927766 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22-scripts\") pod \"ovn-controller-ovs-gt7fw\" (UID: \"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22\") " pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.927797 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b15e5b7-162e-46ee-a292-bf763704cda6-var-run-ovn\") pod \"ovn-controller-dmq5k\" (UID: \"7b15e5b7-162e-46ee-a292-bf763704cda6\") " pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.927826 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7b15e5b7-162e-46ee-a292-bf763704cda6-var-run\") pod \"ovn-controller-dmq5k\" (UID: \"7b15e5b7-162e-46ee-a292-bf763704cda6\") " pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.927871 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgxl6\" (UniqueName: \"kubernetes.io/projected/7b15e5b7-162e-46ee-a292-bf763704cda6-kube-api-access-dgxl6\") pod \"ovn-controller-dmq5k\" (UID: \"7b15e5b7-162e-46ee-a292-bf763704cda6\") " pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.927919 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b15e5b7-162e-46ee-a292-bf763704cda6-combined-ca-bundle\") pod \"ovn-controller-dmq5k\" (UID: \"7b15e5b7-162e-46ee-a292-bf763704cda6\") " pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.927950 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/7b15e5b7-162e-46ee-a292-bf763704cda6-scripts\") pod \"ovn-controller-dmq5k\" (UID: \"7b15e5b7-162e-46ee-a292-bf763704cda6\") " pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.927982 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22-etc-ovs\") pod \"ovn-controller-ovs-gt7fw\" (UID: \"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22\") " pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.928409 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22-etc-ovs\") pod \"ovn-controller-ovs-gt7fw\" (UID: \"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22\") " pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.928407 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22-var-run\") pod \"ovn-controller-ovs-gt7fw\" (UID: \"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22\") " pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.928587 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7b15e5b7-162e-46ee-a292-bf763704cda6-var-log-ovn\") pod \"ovn-controller-dmq5k\" (UID: \"7b15e5b7-162e-46ee-a292-bf763704cda6\") " pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.928644 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22-var-log\") pod \"ovn-controller-ovs-gt7fw\" (UID: \"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22\") " pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.928694 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7b15e5b7-162e-46ee-a292-bf763704cda6-var-run\") pod \"ovn-controller-dmq5k\" (UID: \"7b15e5b7-162e-46ee-a292-bf763704cda6\") " pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.929514 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22-var-lib\") pod \"ovn-controller-ovs-gt7fw\" (UID: \"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22\") " pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.930639 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b15e5b7-162e-46ee-a292-bf763704cda6-var-run-ovn\") pod \"ovn-controller-dmq5k\" (UID: \"7b15e5b7-162e-46ee-a292-bf763704cda6\") " pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.933364 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22-scripts\") pod \"ovn-controller-ovs-gt7fw\" (UID: \"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22\") " pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.936524 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b15e5b7-162e-46ee-a292-bf763704cda6-combined-ca-bundle\") pod \"ovn-controller-dmq5k\" (UID: \"7b15e5b7-162e-46ee-a292-bf763704cda6\") " pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.940879 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b15e5b7-162e-46ee-a292-bf763704cda6-scripts\") pod \"ovn-controller-dmq5k\" (UID: \"7b15e5b7-162e-46ee-a292-bf763704cda6\") " pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.952642 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b15e5b7-162e-46ee-a292-bf763704cda6-ovn-controller-tls-certs\") pod \"ovn-controller-dmq5k\" (UID: \"7b15e5b7-162e-46ee-a292-bf763704cda6\") " pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.954834 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgxl6\" (UniqueName: \"kubernetes.io/projected/7b15e5b7-162e-46ee-a292-bf763704cda6-kube-api-access-dgxl6\") pod \"ovn-controller-dmq5k\" (UID: \"7b15e5b7-162e-46ee-a292-bf763704cda6\") " pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:24 crc kubenswrapper[4851]: I1001 13:10:24.956912 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsk6b\" (UniqueName: \"kubernetes.io/projected/928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22-kube-api-access-nsk6b\") pod \"ovn-controller-ovs-gt7fw\" (UID: \"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22\") " pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:10:25 crc kubenswrapper[4851]: I1001 13:10:25.086067 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dmq5k" Oct 01 13:10:25 crc kubenswrapper[4851]: I1001 13:10:25.108436 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.227685 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.229147 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.234295 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-9jm9w" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.234314 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.234384 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.235280 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.235656 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.237008 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.352858 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.352921 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f-config\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.352997 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwxwj\" (UniqueName: \"kubernetes.io/projected/c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f-kube-api-access-lwxwj\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.353015 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.353043 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.353103 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.353797 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.353836 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.454953 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.455014 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.455037 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.455059 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.455109 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.455130 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f-config\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.455173 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwxwj\" (UniqueName: \"kubernetes.io/projected/c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f-kube-api-access-lwxwj\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.455188 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 
13:10:26.455480 4851 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.456352 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.456571 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.457102 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f-config\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.462192 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.464169 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.464648 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.475603 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwxwj\" (UniqueName: \"kubernetes.io/projected/c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f-kube-api-access-lwxwj\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.478601 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:26 crc kubenswrapper[4851]: I1001 13:10:26.564463 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:28 crc kubenswrapper[4851]: I1001 13:10:28.867200 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 13:10:28 crc kubenswrapper[4851]: I1001 13:10:28.870167 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 01 13:10:28 crc kubenswrapper[4851]: I1001 13:10:28.876522 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 01 13:10:28 crc kubenswrapper[4851]: I1001 13:10:28.877690 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 01 13:10:28 crc kubenswrapper[4851]: I1001 13:10:28.882735 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 13:10:28 crc kubenswrapper[4851]: I1001 13:10:28.884433 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 01 13:10:28 crc kubenswrapper[4851]: I1001 13:10:28.885618 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-wrbkh" Oct 01 13:10:28 crc kubenswrapper[4851]: I1001 13:10:28.994743 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3739abde-3ff0-4c31-aeb6-f731eb37dfac-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:10:28 crc kubenswrapper[4851]: I1001 13:10:28.994814 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3739abde-3ff0-4c31-aeb6-f731eb37dfac-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:10:28 crc kubenswrapper[4851]: I1001 13:10:28.994841 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g64n4\" (UniqueName: \"kubernetes.io/projected/3739abde-3ff0-4c31-aeb6-f731eb37dfac-kube-api-access-g64n4\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:10:28 crc kubenswrapper[4851]: I1001 13:10:28.994869 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3739abde-3ff0-4c31-aeb6-f731eb37dfac-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:10:28 crc kubenswrapper[4851]: I1001 13:10:28.995438 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3739abde-3ff0-4c31-aeb6-f731eb37dfac-config\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:10:28 crc kubenswrapper[4851]: I1001 13:10:28.995554 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:10:28 crc 
kubenswrapper[4851]: I1001 13:10:28.995597 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3739abde-3ff0-4c31-aeb6-f731eb37dfac-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0"
Oct 01 13:10:28 crc kubenswrapper[4851]: I1001 13:10:28.995663 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3739abde-3ff0-4c31-aeb6-f731eb37dfac-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0"
Oct 01 13:10:29 crc kubenswrapper[4851]: I1001 13:10:29.097932 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3739abde-3ff0-4c31-aeb6-f731eb37dfac-config\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0"
Oct 01 13:10:29 crc kubenswrapper[4851]: I1001 13:10:29.098017 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0"
Oct 01 13:10:29 crc kubenswrapper[4851]: I1001 13:10:29.098059 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3739abde-3ff0-4c31-aeb6-f731eb37dfac-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0"
Oct 01 13:10:29 crc kubenswrapper[4851]: I1001 13:10:29.098122 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3739abde-3ff0-4c31-aeb6-f731eb37dfac-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0"
Oct 01 13:10:29 crc kubenswrapper[4851]: I1001 13:10:29.098251 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3739abde-3ff0-4c31-aeb6-f731eb37dfac-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0"
Oct 01 13:10:29 crc kubenswrapper[4851]: I1001 13:10:29.098288 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3739abde-3ff0-4c31-aeb6-f731eb37dfac-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0"
Oct 01 13:10:29 crc kubenswrapper[4851]: I1001 13:10:29.098310 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g64n4\" (UniqueName: \"kubernetes.io/projected/3739abde-3ff0-4c31-aeb6-f731eb37dfac-kube-api-access-g64n4\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0"
Oct 01 13:10:29 crc kubenswrapper[4851]: I1001 13:10:29.098334 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3739abde-3ff0-4c31-aeb6-f731eb37dfac-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0"
Oct 01 13:10:29 crc kubenswrapper[4851]: I1001 13:10:29.098433 4851 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0"
Oct 01 13:10:29 crc kubenswrapper[4851]: I1001 13:10:29.099769 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3739abde-3ff0-4c31-aeb6-f731eb37dfac-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0"
Oct 01 13:10:29 crc kubenswrapper[4851]: I1001 13:10:29.099886 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3739abde-3ff0-4c31-aeb6-f731eb37dfac-config\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0"
Oct 01 13:10:29 crc kubenswrapper[4851]: I1001 13:10:29.101251 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3739abde-3ff0-4c31-aeb6-f731eb37dfac-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0"
Oct 01 13:10:29 crc kubenswrapper[4851]: I1001 13:10:29.108402 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3739abde-3ff0-4c31-aeb6-f731eb37dfac-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0"
Oct 01 13:10:29 crc kubenswrapper[4851]: I1001 13:10:29.108443 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3739abde-3ff0-4c31-aeb6-f731eb37dfac-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0"
Oct 01 13:10:29 crc kubenswrapper[4851]: I1001 13:10:29.115680 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3739abde-3ff0-4c31-aeb6-f731eb37dfac-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0"
Oct 01 13:10:29 crc kubenswrapper[4851]: I1001 13:10:29.124412 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g64n4\" (UniqueName: \"kubernetes.io/projected/3739abde-3ff0-4c31-aeb6-f731eb37dfac-kube-api-access-g64n4\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0"
Oct 01 13:10:29 crc kubenswrapper[4851]: I1001 13:10:29.128272 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3739abde-3ff0-4c31-aeb6-f731eb37dfac\") " pod="openstack/ovsdbserver-sb-0"
Oct 01 13:10:29 crc kubenswrapper[4851]: I1001 13:10:29.200368 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Oct 01 13:10:30 crc kubenswrapper[4851]: E1001 13:10:30.677150 4851 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/podified-master-centos10/openstack-neutron-server:watcher_latest"
Oct 01 13:10:30 crc kubenswrapper[4851]: E1001 13:10:30.677673 4851 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/podified-master-centos10/openstack-neutron-server:watcher_latest"
Oct 01 13:10:30 crc kubenswrapper[4851]: E1001 13:10:30.677940 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.36:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w4z2n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6dd886f455-5sn9b_openstack(035372ef-7ca6-4afa-bcd7-1e56f5a79a0f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Oct 01 13:10:30 crc kubenswrapper[4851]: E1001 13:10:30.679282 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6dd886f455-5sn9b" podUID="035372ef-7ca6-4afa-bcd7-1e56f5a79a0f"
Oct 01 13:10:30 crc kubenswrapper[4851]: E1001 13:10:30.739672 4851 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/podified-master-centos10/openstack-neutron-server:watcher_latest"
Oct 01 13:10:30 crc kubenswrapper[4851]: E1001 13:10:30.739717 4851 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/podified-master-centos10/openstack-neutron-server:watcher_latest"
Oct 01 13:10:30 crc kubenswrapper[4851]: E1001 13:10:30.739832 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.36:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sgkmh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-77cbcfdc8f-v52m6_openstack(0db2d45b-2535-4e9b-8109-b3c8bde34391): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Oct 01 13:10:30 crc kubenswrapper[4851]: E1001 13:10:30.741072 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-77cbcfdc8f-v52m6" podUID="0db2d45b-2535-4e9b-8109-b3c8bde34391"
Oct 01 13:10:31 crc kubenswrapper[4851]: I1001 13:10:31.344892 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"]
Oct 01 13:10:31 crc kubenswrapper[4851]: W1001 13:10:31.349559 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cae03b9_aeea_4fc7_90b0_7c0aad42f8fb.slice/crio-bb30dd1511d45b4f34039d9d34f4bbdc7119eb6829d6490095b44e4eb2813dc4 WatchSource:0}: Error finding container bb30dd1511d45b4f34039d9d34f4bbdc7119eb6829d6490095b44e4eb2813dc4: Status 404 returned error can't find the container with id bb30dd1511d45b4f34039d9d34f4bbdc7119eb6829d6490095b44e4eb2813dc4
Oct 01 13:10:31 crc kubenswrapper[4851]: I1001 13:10:31.478375 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb","Type":"ContainerStarted","Data":"bb30dd1511d45b4f34039d9d34f4bbdc7119eb6829d6490095b44e4eb2813dc4"}
Oct 01 13:10:31 crc kubenswrapper[4851]: I1001 13:10:31.913365 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dd886f455-5sn9b"
Oct 01 13:10:31 crc kubenswrapper[4851]: I1001 13:10:31.920297 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77cbcfdc8f-v52m6"
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.024975 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.034518 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57945c477-2qxhb"]
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.042550 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dmq5k"]
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.051760 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4z2n\" (UniqueName: \"kubernetes.io/projected/035372ef-7ca6-4afa-bcd7-1e56f5a79a0f-kube-api-access-w4z2n\") pod \"035372ef-7ca6-4afa-bcd7-1e56f5a79a0f\" (UID: \"035372ef-7ca6-4afa-bcd7-1e56f5a79a0f\") "
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.051869 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0db2d45b-2535-4e9b-8109-b3c8bde34391-dns-svc\") pod \"0db2d45b-2535-4e9b-8109-b3c8bde34391\" (UID: \"0db2d45b-2535-4e9b-8109-b3c8bde34391\") "
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.051901 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0db2d45b-2535-4e9b-8109-b3c8bde34391-config\") pod \"0db2d45b-2535-4e9b-8109-b3c8bde34391\" (UID: \"0db2d45b-2535-4e9b-8109-b3c8bde34391\") "
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.051976 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/035372ef-7ca6-4afa-bcd7-1e56f5a79a0f-config\") pod \"035372ef-7ca6-4afa-bcd7-1e56f5a79a0f\" (UID: \"035372ef-7ca6-4afa-bcd7-1e56f5a79a0f\") "
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.052074 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgkmh\" (UniqueName: \"kubernetes.io/projected/0db2d45b-2535-4e9b-8109-b3c8bde34391-kube-api-access-sgkmh\") pod \"0db2d45b-2535-4e9b-8109-b3c8bde34391\" (UID: \"0db2d45b-2535-4e9b-8109-b3c8bde34391\") "
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.054428 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0db2d45b-2535-4e9b-8109-b3c8bde34391-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0db2d45b-2535-4e9b-8109-b3c8bde34391" (UID: "0db2d45b-2535-4e9b-8109-b3c8bde34391"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.054838 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035372ef-7ca6-4afa-bcd7-1e56f5a79a0f-config" (OuterVolumeSpecName: "config") pod "035372ef-7ca6-4afa-bcd7-1e56f5a79a0f" (UID: "035372ef-7ca6-4afa-bcd7-1e56f5a79a0f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.055464 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0db2d45b-2535-4e9b-8109-b3c8bde34391-config" (OuterVolumeSpecName: "config") pod "0db2d45b-2535-4e9b-8109-b3c8bde34391" (UID: "0db2d45b-2535-4e9b-8109-b3c8bde34391"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.059013 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/035372ef-7ca6-4afa-bcd7-1e56f5a79a0f-kube-api-access-w4z2n" (OuterVolumeSpecName: "kube-api-access-w4z2n") pod "035372ef-7ca6-4afa-bcd7-1e56f5a79a0f" (UID: "035372ef-7ca6-4afa-bcd7-1e56f5a79a0f"). InnerVolumeSpecName "kube-api-access-w4z2n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.063720 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56595f9cf7-29fbx"]
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.066818 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0db2d45b-2535-4e9b-8109-b3c8bde34391-kube-api-access-sgkmh" (OuterVolumeSpecName: "kube-api-access-sgkmh") pod "0db2d45b-2535-4e9b-8109-b3c8bde34391" (UID: "0db2d45b-2535-4e9b-8109-b3c8bde34391"). InnerVolumeSpecName "kube-api-access-sgkmh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.076437 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.087605 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.096743 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Oct 01 13:10:32 crc kubenswrapper[4851]: W1001 13:10:32.108215 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95c53639_696f_4d10_a297_7173dd3b394f.slice/crio-c2a1ac02edab40d1ace222280655edd43af85e0df391ea83b621a64b5849bc71 WatchSource:0}: Error finding container c2a1ac02edab40d1ace222280655edd43af85e0df391ea83b621a64b5849bc71: Status 404 returned error can't find the container with id c2a1ac02edab40d1ace222280655edd43af85e0df391ea83b621a64b5849bc71
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.108808 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-749b584b69-tjnss"]
Oct 01 13:10:32 crc kubenswrapper[4851]: W1001 13:10:32.110772 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod705d247f_3afa_49f5_ba1d_ab991af3e399.slice/crio-1649ecd3aabfe85624fc4ad7e45be6084bcbc5553d07984ce6780f213b6a8054 WatchSource:0}: Error finding container 1649ecd3aabfe85624fc4ad7e45be6084bcbc5553d07984ce6780f213b6a8054: Status 404 returned error can't find the container with id 1649ecd3aabfe85624fc4ad7e45be6084bcbc5553d07984ce6780f213b6a8054
Oct 01 13:10:32 crc kubenswrapper[4851]: W1001 13:10:32.113084 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31bf58ed_cbe7_48da_a24a_677a859e487f.slice/crio-1c1f13667bb3873c864fe744e6ae157d9092d695ad95f07d8a5482b03e91b739 WatchSource:0}: Error finding container 1c1f13667bb3873c864fe744e6ae157d9092d695ad95f07d8a5482b03e91b739: Status 404 returned error can't find the container with id 1c1f13667bb3873c864fe744e6ae157d9092d695ad95f07d8a5482b03e91b739
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.151812 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.154726 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgkmh\" (UniqueName: \"kubernetes.io/projected/0db2d45b-2535-4e9b-8109-b3c8bde34391-kube-api-access-sgkmh\") on node \"crc\" DevicePath \"\""
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.154745 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4z2n\" (UniqueName: \"kubernetes.io/projected/035372ef-7ca6-4afa-bcd7-1e56f5a79a0f-kube-api-access-w4z2n\") on node \"crc\" DevicePath \"\""
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.154756 4851 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0db2d45b-2535-4e9b-8109-b3c8bde34391-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.154764 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0db2d45b-2535-4e9b-8109-b3c8bde34391-config\") on node \"crc\" DevicePath \"\""
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.154772 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/035372ef-7ca6-4afa-bcd7-1e56f5a79a0f-config\") on node \"crc\" DevicePath \"\""
Oct 01 13:10:32 crc kubenswrapper[4851]: W1001 13:10:32.160064 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3e4f5ad_f2e6_4a32_a900_bba6cc4ac88f.slice/crio-1260d5a1b699797a6edd45cd5fcbbabae686ab52b6b7e6eb773d4c085ae5e928 WatchSource:0}: Error finding container 1260d5a1b699797a6edd45cd5fcbbabae686ab52b6b7e6eb773d4c085ae5e928: Status 404 returned error can't find the container with id 1260d5a1b699797a6edd45cd5fcbbabae686ab52b6b7e6eb773d4c085ae5e928
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.184688 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.210616 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Oct 01 13:10:32 crc kubenswrapper[4851]: W1001 13:10:32.221101 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c2ac1cc_c49c_4966_a357_2d1ba04d5671.slice/crio-29865636d02eed7e3a231cad9fbaab281f3a223fc617697aa5f39860eac2c54c WatchSource:0}: Error finding container 29865636d02eed7e3a231cad9fbaab281f3a223fc617697aa5f39860eac2c54c: Status 404 returned error can't find the container with id 29865636d02eed7e3a231cad9fbaab281f3a223fc617697aa5f39860eac2c54c
Oct 01 13:10:32 crc kubenswrapper[4851]: E1001 13:10:32.228741 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:38.102.83.36:5001/podified-master-centos10/openstack-mariadb:watcher_latest,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:DB_ROOT_PASSWORD,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:DbRootPassword,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:secrets,ReadOnly:true,MountPath:/var/lib/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kf7dj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(7c2ac1cc-c49c-4966-a357-2d1ba04d5671): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 01 13:10:32 crc kubenswrapper[4851]: E1001 13:10:32.230080 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/openstack-galera-0" podUID="7c2ac1cc-c49c-4966-a357-2d1ba04d5671"
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.240162 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-gt7fw"]
Oct 01 13:10:32 crc kubenswrapper[4851]: W1001 13:10:32.240797 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod928b6ee6_bba0_4da7_a3d5_80b9bcfdfb22.slice/crio-81093397387ac04529456a12670ab05f84d3a78b539477685ec32c66eac258dc WatchSource:0}: Error finding container 81093397387ac04529456a12670ab05f84d3a78b539477685ec32c66eac258dc: Status 404 returned error can't find the container with id 81093397387ac04529456a12670ab05f84d3a78b539477685ec32c66eac258dc
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.249650 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Oct 01 13:10:32 crc kubenswrapper[4851]: E1001 13:10:32.253915 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:ovsdb-server-init,Image:38.102.83.36:5001/podified-master-centos10/openstack-ovn-base:watcher_latest,Command:[/usr/local/bin/container-scripts/init-ovsdb-server.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5cbh5f8hc7h5cfh86h9dhf6h58fhdfh58dh58ch56chfbh567h8ch5f9h56ch546h64h5c5h546h86h5dh549h5fch64fh64ch574h7hc7h5f9h5cfq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-ovs,ReadOnly:false,MountPath:/etc/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log,ReadOnly:false,MountPath:/var/log/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-lib,ReadOnly:false,MountPath:/var/lib/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nsk6b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-ovs-gt7fw_openstack(928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 01 13:10:32 crc kubenswrapper[4851]: W1001 13:10:32.254875 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod486b3f7c_3593_4950_b470_2d0a2f037e2f.slice/crio-fc55e2cc6f77556bb8856c85cf003505683d767035c8b0cef7bbdc828998e098 WatchSource:0}: Error finding container fc55e2cc6f77556bb8856c85cf003505683d767035c8b0cef7bbdc828998e098: Status 404 returned error can't find the container with id fc55e2cc6f77556bb8856c85cf003505683d767035c8b0cef7bbdc828998e098
Oct 01 13:10:32 crc kubenswrapper[4851]: E1001 13:10:32.255781 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/ovn-controller-ovs-gt7fw" podUID="928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22"
Oct 01 13:10:32 crc kubenswrapper[4851]: E1001 13:10:32.261608 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init-config-reloader,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:8597c48fc71fc6ec8e87dbe40dace4dbb7b817c1039db608af76a0d90f7ac2d0,Command:[/bin/prometheus-config-reloader],Args:[--watch-interval=0 --listen-address=:8081 --config-file=/etc/prometheus/config/prometheus.yaml.gz --config-envsubst-file=/etc/prometheus/config_out/prometheus.env.yaml --watched-dir=/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:reloader-web,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:SHARD,Value:0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/prometheus/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-out,ReadOnly:false,MountPath:/etc/prometheus/config_out,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-0,ReadOnly:false,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h97sc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod prometheus-metric-storage-0_openstack(486b3f7c-3593-4950-b470-2d0a2f037e2f): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 01 13:10:32 crc kubenswrapper[4851]: E1001 13:10:32.262803 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init-config-reloader\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/prometheus-metric-storage-0" podUID="486b3f7c-3593-4950-b470-2d0a2f037e2f"
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.322322 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.490569 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dd886f455-5sn9b" event={"ID":"035372ef-7ca6-4afa-bcd7-1e56f5a79a0f","Type":"ContainerDied","Data":"ac55713d100fb14d5de4d10c3e6611f8096ce1458b44c04a704c625b2d196815"}
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.490649 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dd886f455-5sn9b"
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.493390 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gt7fw" event={"ID":"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22","Type":"ContainerStarted","Data":"81093397387ac04529456a12670ab05f84d3a78b539477685ec32c66eac258dc"}
Oct 01 13:10:32 crc kubenswrapper[4851]: E1001 13:10:32.494996 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/podified-master-centos10/openstack-ovn-base:watcher_latest\\\"\"" pod="openstack/ovn-controller-ovs-gt7fw" podUID="928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22"
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.495075 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"705d247f-3afa-49f5-ba1d-ab991af3e399","Type":"ContainerStarted","Data":"1649ecd3aabfe85624fc4ad7e45be6084bcbc5553d07984ce6780f213b6a8054"}
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.496042 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"486b3f7c-3593-4950-b470-2d0a2f037e2f","Type":"ContainerStarted","Data":"fc55e2cc6f77556bb8856c85cf003505683d767035c8b0cef7bbdc828998e098"}
Oct 01 13:10:32 crc kubenswrapper[4851]: E1001 13:10:32.498746 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init-config-reloader\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:8597c48fc71fc6ec8e87dbe40dace4dbb7b817c1039db608af76a0d90f7ac2d0\\\"\"" pod="openstack/prometheus-metric-storage-0" podUID="486b3f7c-3593-4950-b470-2d0a2f037e2f"
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.499312 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dmq5k" event={"ID":"7b15e5b7-162e-46ee-a292-bf763704cda6","Type":"ContainerStarted","Data":"950c56340411df170e0bee370d0d4d2aca10e135210159b4e2f0aa42c3514399"}
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.500998 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5dbbdd15-a094-4982-a2f1-370f00a1b004","Type":"ContainerStarted","Data":"52d7eafdfc14345b22576837949d2cb2cb47e0fc0307800620f2901ce81bf079"}
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.506918 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"311e8f50-9e4a-4a03-bc24-04b76d53a238","Type":"ContainerStarted","Data":"9ea39b543d04eebc7b8a681d8ef95ef4f3f1c135cdaa94bfced0abab701b7501"}
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.508881 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56595f9cf7-29fbx" event={"ID":"af2859ca-d690-49fb-8dd6-2b50441aa577","Type":"ContainerStarted","Data":"6efc1c43f7a68d92c2133143c9638fb3c8d859955320e27052f7e1bd94529ca2"}
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.510794 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7c2ac1cc-c49c-4966-a357-2d1ba04d5671","Type":"ContainerStarted","Data":"29865636d02eed7e3a231cad9fbaab281f3a223fc617697aa5f39860eac2c54c"}
Oct 01 13:10:32 crc kubenswrapper[4851]: E1001 13:10:32.512271 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/podified-master-centos10/openstack-mariadb:watcher_latest\\\"\"" pod="openstack/openstack-galera-0" podUID="7c2ac1cc-c49c-4966-a357-2d1ba04d5671"
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.513579 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57945c477-2qxhb" event={"ID":"513e329d-81d9-448b-be02-43bf41b95f09","Type":"ContainerStarted","Data":"4487661d754e7f9bdba9f56e22edc2e268af878441b2da11fe8b8ab54644b631"}
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.515523 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77cbcfdc8f-v52m6"
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.515519 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77cbcfdc8f-v52m6" event={"ID":"0db2d45b-2535-4e9b-8109-b3c8bde34391","Type":"ContainerDied","Data":"da146c97d540f5be48ad583ca6d38b2ebf3622581970145c0e54c2b89ea173e8"}
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.517003 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"712c3704-a775-4fac-81d6-9aa9cfdc48ef","Type":"ContainerStarted","Data":"8214e828bf9c3d248a79ae102512b98df8b697071f7e04edd37199923d3f58dd"}
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.519632 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"95c53639-696f-4d10-a297-7173dd3b394f","Type":"ContainerStarted","Data":"c2a1ac02edab40d1ace222280655edd43af85e0df391ea83b621a64b5849bc71"}
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.523690 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f","Type":"ContainerStarted","Data":"1260d5a1b699797a6edd45cd5fcbbabae686ab52b6b7e6eb773d4c085ae5e928"}
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.525559 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3739abde-3ff0-4c31-aeb6-f731eb37dfac","Type":"ContainerStarted","Data":"92a0a450c899e1346fc6fe8118bf3a0813bc629b368af2e70c851b5cb29b2b3b"}
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.526987 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-749b584b69-tjnss" event={"ID":"31bf58ed-cbe7-48da-a24a-677a859e487f","Type":"ContainerStarted","Data":"1c1f13667bb3873c864fe744e6ae157d9092d695ad95f07d8a5482b03e91b739"}
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.700725 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dd886f455-5sn9b"]
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.706173 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dd886f455-5sn9b"]
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.763594 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77cbcfdc8f-v52m6"]
Oct 01 13:10:32 crc kubenswrapper[4851]: I1001 13:10:32.768190 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77cbcfdc8f-v52m6"]
Oct 01 13:10:33 crc kubenswrapper[4851]: E1001 13:10:33.542963 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/podified-master-centos10/openstack-ovn-base:watcher_latest\\\"\"" pod="openstack/ovn-controller-ovs-gt7fw" podUID="928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22"
Oct 01 13:10:33 crc kubenswrapper[4851]: E1001 13:10:33.543434 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/podified-master-centos10/openstack-mariadb:watcher_latest\\\"\"" pod="openstack/openstack-galera-0" podUID="7c2ac1cc-c49c-4966-a357-2d1ba04d5671"
Oct 01 13:10:33 crc kubenswrapper[4851]: E1001 13:10:33.544996 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init-config-reloader\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:8597c48fc71fc6ec8e87dbe40dace4dbb7b817c1039db608af76a0d90f7ac2d0\\\"\"" pod="openstack/prometheus-metric-storage-0" podUID="486b3f7c-3593-4950-b470-2d0a2f037e2f"
Oct 01 13:10:34 crc kubenswrapper[4851]: I1001 13:10:34.341417 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="035372ef-7ca6-4afa-bcd7-1e56f5a79a0f" path="/var/lib/kubelet/pods/035372ef-7ca6-4afa-bcd7-1e56f5a79a0f/volumes"
Oct 01 13:10:34 crc kubenswrapper[4851]: I1001 13:10:34.342049 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0db2d45b-2535-4e9b-8109-b3c8bde34391" path="/var/lib/kubelet/pods/0db2d45b-2535-4e9b-8109-b3c8bde34391/volumes"
Oct 01 13:10:45 crc kubenswrapper[4851]: E1001 13:10:45.052300 4851 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/podified-master-centos10/openstack-rabbitmq:watcher_latest"
Oct 01 13:10:45 crc kubenswrapper[4851]: E1001 13:10:45.053083 4851 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/podified-master-centos10/openstack-rabbitmq:watcher_latest"
Oct 01 13:10:45 crc kubenswrapper[4851]: E1001 13:10:45.053308 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:38.102.83.36:5001/podified-master-centos10/openstack-rabbitmq:watcher_latest,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lwltf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-notifications-server-0_openstack(8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Oct 01 13:10:45 crc kubenswrapper[4851]: E1001 13:10:45.054561 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-notifications-server-0" podUID="8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb"
Oct 01 13:10:49 crc kubenswrapper[4851]: I1001 13:10:49.723747 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7c2ac1cc-c49c-4966-a357-2d1ba04d5671","Type":"ContainerStarted","Data":"54b6c69bc0267f719276f176daf366317beed2ce787c890d0c4d96b44a5950cc"}
Oct 01 13:10:49 crc kubenswrapper[4851]: I1001 13:10:49.729196 4851 generic.go:334] "Generic (PLEG): container finished" podID="513e329d-81d9-448b-be02-43bf41b95f09" containerID="63cbe7c7b8de78d4e986d60e321b3deb9b259065e64592c261935951e57ecc35" exitCode=0
Oct 01 13:10:49 crc kubenswrapper[4851]: I1001 13:10:49.729288 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57945c477-2qxhb" event={"ID":"513e329d-81d9-448b-be02-43bf41b95f09","Type":"ContainerDied","Data":"63cbe7c7b8de78d4e986d60e321b3deb9b259065e64592c261935951e57ecc35"}
Oct 01 13:10:49 crc kubenswrapper[4851]: I1001 13:10:49.732822 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f","Type":"ContainerStarted","Data":"f16f76bdc61bdb94da8db0c72b6dc0f94028afe599af8c6c9534057d9cf210b4"}
Oct 01 13:10:49 crc kubenswrapper[4851]: I1001 13:10:49.734800 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3739abde-3ff0-4c31-aeb6-f731eb37dfac","Type":"ContainerStarted","Data":"e7f39214b648fb40bb92f0e39154befa156fa3e853790e96bbac7ffb1479bc19"}
Oct 01 13:10:49 crc kubenswrapper[4851]: I1001 13:10:49.736654 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"705d247f-3afa-49f5-ba1d-ab991af3e399","Type":"ContainerStarted","Data":"1489eeaed206b0afd2361a155b016c59a95578d9f264809db4bcf05d41e44085"}
Oct 01 13:10:49 crc kubenswrapper[4851]: I1001 13:10:49.737414 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Oct 01 13:10:49 crc kubenswrapper[4851]: I1001 13:10:49.739259 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5dbbdd15-a094-4982-a2f1-370f00a1b004","Type":"ContainerStarted","Data":"24fb2f570ee3764bef35b360f96d4fee8a188aee286c938cb5c8e7a885e0f12c"}
Oct 01 13:10:49 crc kubenswrapper[4851]: I1001 13:10:49.739852 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Oct 01 13:10:49 crc kubenswrapper[4851]: I1001 13:10:49.744734 4851 generic.go:334] "Generic (PLEG): container finished" podID="31bf58ed-cbe7-48da-a24a-677a859e487f" containerID="c7968182c7b0cce047164755e15a05a003d56b37e39f0e3b9bdc86983cbcb1d9" exitCode=0
Oct 01 13:10:49 crc kubenswrapper[4851]: I1001 13:10:49.744982 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-749b584b69-tjnss" event={"ID":"31bf58ed-cbe7-48da-a24a-677a859e487f","Type":"ContainerDied","Data":"c7968182c7b0cce047164755e15a05a003d56b37e39f0e3b9bdc86983cbcb1d9"}
Oct 01 13:10:49 crc kubenswrapper[4851]: I1001 13:10:49.750030 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"311e8f50-9e4a-4a03-bc24-04b76d53a238","Type":"ContainerStarted","Data":"cd0a793e96754f5875c83b061c11ac6fd4fe2c4ab25012a88d6c5610311fd3d6"}
Oct 01 13:10:49 crc kubenswrapper[4851]: I1001 13:10:49.753654 4851 generic.go:334] "Generic (PLEG): container finished" podID="af2859ca-d690-49fb-8dd6-2b50441aa577" containerID="4ae053a22d1220c462d4dc1dc6d796cd3bdc4f6fa977347571ead9987a93889e" exitCode=0
Oct 01 13:10:49 crc kubenswrapper[4851]: I1001 13:10:49.753696 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56595f9cf7-29fbx" event={"ID":"af2859ca-d690-49fb-8dd6-2b50441aa577","Type":"ContainerDied","Data":"4ae053a22d1220c462d4dc1dc6d796cd3bdc4f6fa977347571ead9987a93889e"}
Oct 01 13:10:49 crc kubenswrapper[4851]: I1001 13:10:49.784114 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=17.090542512 podStartE2EDuration="30.784098013s" podCreationTimestamp="2025-10-01 13:10:19 +0000 UTC" firstStartedPulling="2025-10-01 13:10:32.116096338 +0000 UTC m=+1040.461213824" lastFinishedPulling="2025-10-01 13:10:45.809651809 +0000 UTC m=+1054.154769325" observedRunningTime="2025-10-01 13:10:49.762427207 +0000 UTC m=+1058.107544703" watchObservedRunningTime="2025-10-01 13:10:49.784098013 +0000 UTC m=+1058.129215489"
Oct 01 13:10:49 crc kubenswrapper[4851]: I1001 13:10:49.787429 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=11.525371421 podStartE2EDuration="28.787421347s" podCreationTimestamp="2025-10-01 13:10:21 +0000 UTC" firstStartedPulling="2025-10-01 13:10:32.098617901 +0000 UTC m=+1040.443735397" lastFinishedPulling="2025-10-01 13:10:49.360667837 +0000 UTC m=+1057.705785323" observedRunningTime="2025-10-01 13:10:49.777535276 +0000 UTC m=+1058.122652762" watchObservedRunningTime="2025-10-01 13:10:49.787421347 +0000 UTC m=+1058.132538833"
Oct 01 13:10:50 crc kubenswrapper[4851]: I1001 13:10:50.764265 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-749b584b69-tjnss" event={"ID":"31bf58ed-cbe7-48da-a24a-677a859e487f","Type":"ContainerStarted","Data":"a24471a927924d4ecc55b73d18060f9f9c53feafaed20fd8923c37e555eda895"}
Oct 01 13:10:50 crc kubenswrapper[4851]: I1001 13:10:50.764903 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-749b584b69-tjnss"
Oct 01 13:10:50 crc kubenswrapper[4851]: I1001 13:10:50.767476 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56595f9cf7-29fbx" event={"ID":"af2859ca-d690-49fb-8dd6-2b50441aa577","Type":"ContainerStarted","Data":"d7715ee4abd20db674ef93a19e0cb5be83e147c9ecfc904a30f8f134ebe9bf01"}
Oct 01 13:10:50 crc kubenswrapper[4851]: I1001 13:10:50.767659 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56595f9cf7-29fbx"
Oct 01 13:10:50 crc kubenswrapper[4851]: I1001 13:10:50.769606 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gt7fw" event={"ID":"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22","Type":"ContainerStarted","Data":"83451929d32e355e678d35d6bdb921ffbae5e2bec39e7d5b39c890955a106217"}
Oct 01 13:10:50 crc kubenswrapper[4851]: I1001 13:10:50.772191 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57945c477-2qxhb" event={"ID":"513e329d-81d9-448b-be02-43bf41b95f09","Type":"ContainerDied","Data":"4487661d754e7f9bdba9f56e22edc2e268af878441b2da11fe8b8ab54644b631"}
Oct 01 13:10:50 crc kubenswrapper[4851]: I1001 13:10:50.772216 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4487661d754e7f9bdba9f56e22edc2e268af878441b2da11fe8b8ab54644b631"
Oct 01 13:10:50 crc kubenswrapper[4851]: I1001 13:10:50.775021 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dmq5k" event={"ID":"7b15e5b7-162e-46ee-a292-bf763704cda6","Type":"ContainerStarted","Data":"b40ef71f5a410aa21743d14f2abf5e964582f1dcdd6e3f94ed7867f9188c674b"}
Oct 01 13:10:50 crc kubenswrapper[4851]: I1001 13:10:50.775056 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-dmq5k"
Oct 01 13:10:50 crc kubenswrapper[4851]: I1001 13:10:50.786713 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-749b584b69-tjnss" podStartSLOduration=23.850482532 podStartE2EDuration="36.786667974s" podCreationTimestamp="2025-10-01 13:10:14 +0000 UTC" firstStartedPulling="2025-10-01 13:10:32.133626455 +0000 UTC m=+1040.478743931" lastFinishedPulling="2025-10-01 13:10:45.069811857 +0000 UTC m=+1053.414929373" observedRunningTime="2025-10-01 13:10:50.784341788 +0000 UTC m=+1059.129459294" watchObservedRunningTime="2025-10-01 13:10:50.786667974 +0000 UTC m=+1059.131785470"
Oct 01 13:10:50 crc kubenswrapper[4851]: I1001 13:10:50.811152 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56595f9cf7-29fbx" podStartSLOduration=22.102685255 podStartE2EDuration="36.811133049s" podCreationTimestamp="2025-10-01 13:10:14 +0000 UTC" firstStartedPulling="2025-10-01 13:10:32.09680879 +0000 UTC m=+1040.441926276" lastFinishedPulling="2025-10-01 13:10:46.805256574 +0000 UTC m=+1055.150374070" observedRunningTime="2025-10-01 13:10:50.803881433 +0000 UTC m=+1059.148998919" watchObservedRunningTime="2025-10-01 13:10:50.811133049 +0000 UTC m=+1059.156250535"
Oct 01 13:10:50 crc kubenswrapper[4851]: I1001 13:10:50.829305 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-dmq5k" podStartSLOduration=11.837912616 podStartE2EDuration="26.829288325s" podCreationTimestamp="2025-10-01 13:10:24 +0000 UTC" firstStartedPulling="2025-10-01 13:10:32.089679477 +0000 UTC m=+1040.434796963" lastFinishedPulling="2025-10-01 13:10:47.081055156 +0000 UTC m=+1055.426172672" observedRunningTime="2025-10-01 13:10:50.826012742 +0000 UTC m=+1059.171130238" watchObservedRunningTime="2025-10-01 13:10:50.829288325 +0000 UTC m=+1059.174405811"
Oct 01 13:10:51 crc kubenswrapper[4851]: I1001 13:10:51.019237 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57945c477-2qxhb"
Oct 01 13:10:51 crc kubenswrapper[4851]: I1001 13:10:51.141833 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/513e329d-81d9-448b-be02-43bf41b95f09-dns-svc\") pod \"513e329d-81d9-448b-be02-43bf41b95f09\" (UID: \"513e329d-81d9-448b-be02-43bf41b95f09\") "
Oct 01 13:10:51 crc kubenswrapper[4851]: I1001 13:10:51.141944 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsnzw\" (UniqueName: \"kubernetes.io/projected/513e329d-81d9-448b-be02-43bf41b95f09-kube-api-access-tsnzw\") pod \"513e329d-81d9-448b-be02-43bf41b95f09\" (UID: \"513e329d-81d9-448b-be02-43bf41b95f09\") "
Oct 01 13:10:51 crc kubenswrapper[4851]: I1001 13:10:51.142043 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/513e329d-81d9-448b-be02-43bf41b95f09-config\") pod \"513e329d-81d9-448b-be02-43bf41b95f09\" (UID: \"513e329d-81d9-448b-be02-43bf41b95f09\") "
Oct 01 13:10:51 crc kubenswrapper[4851]: I1001 13:10:51.191253 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/513e329d-81d9-448b-be02-43bf41b95f09-kube-api-access-tsnzw" (OuterVolumeSpecName: "kube-api-access-tsnzw") pod "513e329d-81d9-448b-be02-43bf41b95f09" (UID: "513e329d-81d9-448b-be02-43bf41b95f09"). InnerVolumeSpecName "kube-api-access-tsnzw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:10:51 crc kubenswrapper[4851]: I1001 13:10:51.244031 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsnzw\" (UniqueName: \"kubernetes.io/projected/513e329d-81d9-448b-be02-43bf41b95f09-kube-api-access-tsnzw\") on node \"crc\" DevicePath \"\""
Oct 01 13:10:51 crc kubenswrapper[4851]: I1001 13:10:51.293902 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/513e329d-81d9-448b-be02-43bf41b95f09-config" (OuterVolumeSpecName: "config") pod "513e329d-81d9-448b-be02-43bf41b95f09" (UID: "513e329d-81d9-448b-be02-43bf41b95f09"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:10:51 crc kubenswrapper[4851]: I1001 13:10:51.326954 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/513e329d-81d9-448b-be02-43bf41b95f09-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "513e329d-81d9-448b-be02-43bf41b95f09" (UID: "513e329d-81d9-448b-be02-43bf41b95f09"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:10:51 crc kubenswrapper[4851]: I1001 13:10:51.345959 4851 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/513e329d-81d9-448b-be02-43bf41b95f09-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 01 13:10:51 crc kubenswrapper[4851]: I1001 13:10:51.345995 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/513e329d-81d9-448b-be02-43bf41b95f09-config\") on node \"crc\" DevicePath \"\""
Oct 01 13:10:51 crc kubenswrapper[4851]: I1001 13:10:51.793710 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb","Type":"ContainerStarted","Data":"614a0e65fe50b13ce4e61379c18ac00726f1b2b0fa03c62cd2708f05b65907e3"}
Oct 01 13:10:51 crc kubenswrapper[4851]: I1001 13:10:51.797126 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"95c53639-696f-4d10-a297-7173dd3b394f","Type":"ContainerStarted","Data":"83ea666e82ffc032b51fb419fe2f3e2cbf43bf29cd27389317204ca318397904"}
Oct 01 13:10:51 crc kubenswrapper[4851]: I1001 13:10:51.805060 4851 generic.go:334] "Generic (PLEG): container finished" podID="928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22" containerID="83451929d32e355e678d35d6bdb921ffbae5e2bec39e7d5b39c890955a106217" exitCode=0
Oct 01 13:10:51 crc kubenswrapper[4851]: I1001 13:10:51.805206 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gt7fw" event={"ID":"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22","Type":"ContainerDied","Data":"83451929d32e355e678d35d6bdb921ffbae5e2bec39e7d5b39c890955a106217"}
Oct 01 13:10:51 crc kubenswrapper[4851]: I1001 13:10:51.809819 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57945c477-2qxhb"
Oct 01 13:10:51 crc kubenswrapper[4851]: I1001 13:10:51.811423 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"712c3704-a775-4fac-81d6-9aa9cfdc48ef","Type":"ContainerStarted","Data":"413d60ea21361b7065a174a88bcb5289e9879147b33e039ab6917e97367daef4"}
Oct 01 13:10:51 crc kubenswrapper[4851]: I1001 13:10:51.948878 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57945c477-2qxhb"]
Oct 01 13:10:51 crc kubenswrapper[4851]: I1001 13:10:51.964582 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57945c477-2qxhb"]
Oct 01 13:10:52 crc kubenswrapper[4851]: I1001 13:10:52.341029 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="513e329d-81d9-448b-be02-43bf41b95f09" path="/var/lib/kubelet/pods/513e329d-81d9-448b-be02-43bf41b95f09/volumes"
Oct 01 13:10:52 crc kubenswrapper[4851]: I1001 13:10:52.821097 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"486b3f7c-3593-4950-b470-2d0a2f037e2f","Type":"ContainerStarted","Data":"e23df47d3fab5210b11bee1ab4f89352721626bac74ccd2d9dd929430f568032"}
Oct 01 13:10:55 crc kubenswrapper[4851]: I1001 13:10:55.079720 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Oct 01 13:10:55 crc kubenswrapper[4851]: I1001 13:10:55.269756 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56595f9cf7-29fbx"
Oct 01 13:10:55 crc kubenswrapper[4851]: I1001 13:10:55.328460 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-749b584b69-tjnss"]
Oct 01 13:10:55 crc kubenswrapper[4851]: I1001 13:10:55.329028 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-749b584b69-tjnss" podUID="31bf58ed-cbe7-48da-a24a-677a859e487f" containerName="dnsmasq-dns" containerID="cri-o://a24471a927924d4ecc55b73d18060f9f9c53feafaed20fd8923c37e555eda895" gracePeriod=10
Oct 01 13:10:55 crc kubenswrapper[4851]: I1001 13:10:55.330752 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-749b584b69-tjnss"
Oct 01 13:10:55 crc kubenswrapper[4851]: I1001 13:10:55.847704 4851 generic.go:334] "Generic (PLEG): container finished" podID="31bf58ed-cbe7-48da-a24a-677a859e487f" containerID="a24471a927924d4ecc55b73d18060f9f9c53feafaed20fd8923c37e555eda895" exitCode=0
Oct 01 13:10:55 crc kubenswrapper[4851]: I1001 13:10:55.847803 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-749b584b69-tjnss" event={"ID":"31bf58ed-cbe7-48da-a24a-677a859e487f","Type":"ContainerDied","Data":"a24471a927924d4ecc55b73d18060f9f9c53feafaed20fd8923c37e555eda895"}
Oct 01 13:10:55 crc kubenswrapper[4851]: I1001 13:10:55.850000 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gt7fw" event={"ID":"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22","Type":"ContainerStarted","Data":"87b2a2a2c5f0b9bf7cad211e99f0fecc14cca55f49ac408761868793283f3a47"}
Oct 01 13:10:56 crc kubenswrapper[4851]: I1001 13:10:56.863029 4851 generic.go:334] "Generic (PLEG): container finished" podID="311e8f50-9e4a-4a03-bc24-04b76d53a238" containerID="cd0a793e96754f5875c83b061c11ac6fd4fe2c4ab25012a88d6c5610311fd3d6" exitCode=0
Oct 01 13:10:56 crc kubenswrapper[4851]: I1001 13:10:56.863084 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"311e8f50-9e4a-4a03-bc24-04b76d53a238","Type":"ContainerDied","Data":"cd0a793e96754f5875c83b061c11ac6fd4fe2c4ab25012a88d6c5610311fd3d6"}
Oct 01 13:10:57 crc kubenswrapper[4851]: I1001 13:10:57.880967 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gt7fw" event={"ID":"928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22","Type":"ContainerStarted","Data":"a838fbd33556c27c00be0cb00075417e491d863c033ce27870d074072fe370d2"}
Oct 01 13:10:57 crc kubenswrapper[4851]: I1001 13:10:57.959035 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-749b584b69-tjnss"
Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.061941 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnkmg\" (UniqueName: \"kubernetes.io/projected/31bf58ed-cbe7-48da-a24a-677a859e487f-kube-api-access-bnkmg\") pod \"31bf58ed-cbe7-48da-a24a-677a859e487f\" (UID: \"31bf58ed-cbe7-48da-a24a-677a859e487f\") "
Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.061998 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31bf58ed-cbe7-48da-a24a-677a859e487f-config\") pod \"31bf58ed-cbe7-48da-a24a-677a859e487f\" (UID: \"31bf58ed-cbe7-48da-a24a-677a859e487f\") "
Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.062044 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31bf58ed-cbe7-48da-a24a-677a859e487f-dns-svc\") pod \"31bf58ed-cbe7-48da-a24a-677a859e487f\" (UID: \"31bf58ed-cbe7-48da-a24a-677a859e487f\") "
Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.072728 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31bf58ed-cbe7-48da-a24a-677a859e487f-kube-api-access-bnkmg" (OuterVolumeSpecName: "kube-api-access-bnkmg") pod "31bf58ed-cbe7-48da-a24a-677a859e487f" (UID: "31bf58ed-cbe7-48da-a24a-677a859e487f"). InnerVolumeSpecName "kube-api-access-bnkmg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.102365 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31bf58ed-cbe7-48da-a24a-677a859e487f-config" (OuterVolumeSpecName: "config") pod "31bf58ed-cbe7-48da-a24a-677a859e487f" (UID: "31bf58ed-cbe7-48da-a24a-677a859e487f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.107694 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31bf58ed-cbe7-48da-a24a-677a859e487f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "31bf58ed-cbe7-48da-a24a-677a859e487f" (UID: "31bf58ed-cbe7-48da-a24a-677a859e487f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.164673 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnkmg\" (UniqueName: \"kubernetes.io/projected/31bf58ed-cbe7-48da-a24a-677a859e487f-kube-api-access-bnkmg\") on node \"crc\" DevicePath \"\""
Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.164732 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31bf58ed-cbe7-48da-a24a-677a859e487f-config\") on node \"crc\" DevicePath \"\""
Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.164751 4851 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31bf58ed-cbe7-48da-a24a-677a859e487f-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.900295 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3739abde-3ff0-4c31-aeb6-f731eb37dfac","Type":"ContainerStarted","Data":"1340c0bc2d3ccd0ef1bb70f0c0fc427b53b0d05dc6ca44fc6207964a14f062dc"}
Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.905225 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-749b584b69-tjnss" event={"ID":"31bf58ed-cbe7-48da-a24a-677a859e487f","Type":"ContainerDied","Data":"1c1f13667bb3873c864fe744e6ae157d9092d695ad95f07d8a5482b03e91b739"}
Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.905277 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-749b584b69-tjnss"
Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.905314 4851 scope.go:117] "RemoveContainer" containerID="a24471a927924d4ecc55b73d18060f9f9c53feafaed20fd8923c37e555eda895"
Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.910985 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"311e8f50-9e4a-4a03-bc24-04b76d53a238","Type":"ContainerStarted","Data":"f38c63b4a16f44e14506f7943bf88a298bb76be89160791fb3c8ba0b76007016"}
Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.914566 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f","Type":"ContainerStarted","Data":"60fbfa094dcf764228a8ea496a1a136433ed91c1ccbad1e55707fc0f68be9d05"}
Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.916887 4851 generic.go:334] "Generic (PLEG): container finished" podID="7c2ac1cc-c49c-4966-a357-2d1ba04d5671" containerID="54b6c69bc0267f719276f176daf366317beed2ce787c890d0c4d96b44a5950cc" exitCode=0
Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.916980 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7c2ac1cc-c49c-4966-a357-2d1ba04d5671","Type":"ContainerDied","Data":"54b6c69bc0267f719276f176daf366317beed2ce787c890d0c4d96b44a5950cc"}
Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.918370 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-gt7fw"
Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.933920 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.089935044 podStartE2EDuration="31.933894784s" podCreationTimestamp="2025-10-01 13:10:27 +0000 UTC" firstStartedPulling="2025-10-01 13:10:32.333353087 +0000 UTC m=+1040.678470593"
lastFinishedPulling="2025-10-01 13:10:58.177312837 +0000 UTC m=+1066.522430333" observedRunningTime="2025-10-01 13:10:58.933147562 +0000 UTC m=+1067.278265118" watchObservedRunningTime="2025-10-01 13:10:58.933894784 +0000 UTC m=+1067.279012310" Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.948384 4851 scope.go:117] "RemoveContainer" containerID="c7968182c7b0cce047164755e15a05a003d56b37e39f0e3b9bdc86983cbcb1d9" Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.975264 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-749b584b69-tjnss"] Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.986214 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-749b584b69-tjnss"] Oct 01 13:10:58 crc kubenswrapper[4851]: I1001 13:10:58.989377 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-gt7fw" podStartSLOduration=17.855432791 podStartE2EDuration="34.989353269s" podCreationTimestamp="2025-10-01 13:10:24 +0000 UTC" firstStartedPulling="2025-10-01 13:10:32.253827438 +0000 UTC m=+1040.598944924" lastFinishedPulling="2025-10-01 13:10:49.387747916 +0000 UTC m=+1057.732865402" observedRunningTime="2025-10-01 13:10:58.98305842 +0000 UTC m=+1067.328175936" watchObservedRunningTime="2025-10-01 13:10:58.989353269 +0000 UTC m=+1067.334470765" Oct 01 13:10:59 crc kubenswrapper[4851]: I1001 13:10:59.027738 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=26.871804914 podStartE2EDuration="41.027708518s" podCreationTimestamp="2025-10-01 13:10:18 +0000 UTC" firstStartedPulling="2025-10-01 13:10:32.198481306 +0000 UTC m=+1040.543598792" lastFinishedPulling="2025-10-01 13:10:46.35438489 +0000 UTC m=+1054.699502396" observedRunningTime="2025-10-01 13:10:59.02322186 +0000 UTC m=+1067.368339416" watchObservedRunningTime="2025-10-01 13:10:59.027708518 +0000 UTC m=+1067.372826044" Oct 01 13:10:59 crc kubenswrapper[4851]: I1001 13:10:59.050132 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=8.012546586 podStartE2EDuration="34.050116864s" podCreationTimestamp="2025-10-01 13:10:25 +0000 UTC" firstStartedPulling="2025-10-01 13:10:32.162227897 +0000 UTC m=+1040.507345383" lastFinishedPulling="2025-10-01 13:10:58.199798155 +0000 UTC m=+1066.544915661" observedRunningTime="2025-10-01 13:10:59.047429218 +0000 UTC m=+1067.392546704" watchObservedRunningTime="2025-10-01 13:10:59.050116864 +0000 UTC m=+1067.395234350" Oct 01 13:10:59 crc kubenswrapper[4851]: I1001 13:10:59.201116 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 01 13:10:59 crc kubenswrapper[4851]: I1001 13:10:59.201155 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 01 13:10:59 crc kubenswrapper[4851]: I1001 13:10:59.249963 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 01 13:10:59 crc kubenswrapper[4851]: I1001 13:10:59.451462 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:59 crc kubenswrapper[4851]: I1001 13:10:59.452200 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 01 13:10:59 crc kubenswrapper[4851]: I1001 13:10:59.565315 4851 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:59 crc kubenswrapper[4851]: I1001 13:10:59.621758 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:59 crc kubenswrapper[4851]: I1001 13:10:59.943912 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7c2ac1cc-c49c-4966-a357-2d1ba04d5671","Type":"ContainerStarted","Data":"23614322449bef74b897d0b85dd38553ec8689feffe87fcc93f075cef1e26dc6"} Oct 01 13:10:59 crc kubenswrapper[4851]: I1001 13:10:59.945026 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 01 13:10:59 crc kubenswrapper[4851]: I1001 13:10:59.945962 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:10:59 crc kubenswrapper[4851]: I1001 13:10:59.984331 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=27.13182532 podStartE2EDuration="41.984262204s" podCreationTimestamp="2025-10-01 13:10:18 +0000 UTC" firstStartedPulling="2025-10-01 13:10:32.228616132 +0000 UTC m=+1040.573733618" lastFinishedPulling="2025-10-01 13:10:47.081052976 +0000 UTC m=+1055.426170502" observedRunningTime="2025-10-01 13:10:59.979621932 +0000 UTC m=+1068.324739498" watchObservedRunningTime="2025-10-01 13:10:59.984262204 +0000 UTC m=+1068.329379730" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.017188 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.025270 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.313680 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fb4b9554f-tswhd"] Oct 01 13:11:00 crc kubenswrapper[4851]: E1001 13:11:00.314034 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31bf58ed-cbe7-48da-a24a-677a859e487f" containerName="dnsmasq-dns" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.314049 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="31bf58ed-cbe7-48da-a24a-677a859e487f" containerName="dnsmasq-dns" Oct 01 13:11:00 crc kubenswrapper[4851]: E1001 13:11:00.314068 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31bf58ed-cbe7-48da-a24a-677a859e487f" containerName="init" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.314075 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="31bf58ed-cbe7-48da-a24a-677a859e487f" containerName="init" Oct 01 13:11:00 crc kubenswrapper[4851]: E1001 13:11:00.314095 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="513e329d-81d9-448b-be02-43bf41b95f09" containerName="init" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.314101 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="513e329d-81d9-448b-be02-43bf41b95f09" containerName="init" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.314261 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="513e329d-81d9-448b-be02-43bf41b95f09" containerName="init" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.314272 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="31bf58ed-cbe7-48da-a24a-677a859e487f" 
containerName="dnsmasq-dns" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.315131 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb4b9554f-tswhd" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.319559 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.326831 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fb4b9554f-tswhd"] Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.342961 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31bf58ed-cbe7-48da-a24a-677a859e487f" path="/var/lib/kubelet/pods/31bf58ed-cbe7-48da-a24a-677a859e487f/volumes" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.376827 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-8sm5d"] Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.377998 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8sm5d" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.385366 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.387620 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8sm5d"] Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.405906 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7drrv\" (UniqueName: \"kubernetes.io/projected/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-kube-api-access-7drrv\") pod \"dnsmasq-dns-fb4b9554f-tswhd\" (UID: \"ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e\") " pod="openstack/dnsmasq-dns-fb4b9554f-tswhd" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.405953 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-ovsdbserver-nb\") pod \"dnsmasq-dns-fb4b9554f-tswhd\" (UID: \"ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e\") " pod="openstack/dnsmasq-dns-fb4b9554f-tswhd" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.406031 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-dns-svc\") pod \"dnsmasq-dns-fb4b9554f-tswhd\" (UID: \"ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e\") " pod="openstack/dnsmasq-dns-fb4b9554f-tswhd" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.406239 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-config\") pod \"dnsmasq-dns-fb4b9554f-tswhd\" (UID: \"ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e\") " pod="openstack/dnsmasq-dns-fb4b9554f-tswhd" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.510295 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fb4b9554f-tswhd"] Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.516359 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4nvg\" (UniqueName: \"kubernetes.io/projected/236d88b9-092a-448d-a793-b133f7abe5f9-kube-api-access-l4nvg\") pod 
\"ovn-controller-metrics-8sm5d\" (UID: \"236d88b9-092a-448d-a793-b133f7abe5f9\") " pod="openstack/ovn-controller-metrics-8sm5d" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.516429 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-dns-svc\") pod \"dnsmasq-dns-fb4b9554f-tswhd\" (UID: \"ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e\") " pod="openstack/dnsmasq-dns-fb4b9554f-tswhd" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.516455 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/236d88b9-092a-448d-a793-b133f7abe5f9-config\") pod \"ovn-controller-metrics-8sm5d\" (UID: \"236d88b9-092a-448d-a793-b133f7abe5f9\") " pod="openstack/ovn-controller-metrics-8sm5d" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.516480 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-config\") pod \"dnsmasq-dns-fb4b9554f-tswhd\" (UID: \"ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e\") " pod="openstack/dnsmasq-dns-fb4b9554f-tswhd" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.516514 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/236d88b9-092a-448d-a793-b133f7abe5f9-ovs-rundir\") pod \"ovn-controller-metrics-8sm5d\" (UID: \"236d88b9-092a-448d-a793-b133f7abe5f9\") " pod="openstack/ovn-controller-metrics-8sm5d" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.516529 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/236d88b9-092a-448d-a793-b133f7abe5f9-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8sm5d\" (UID: \"236d88b9-092a-448d-a793-b133f7abe5f9\") " pod="openstack/ovn-controller-metrics-8sm5d" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.516555 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/236d88b9-092a-448d-a793-b133f7abe5f9-ovn-rundir\") pod \"ovn-controller-metrics-8sm5d\" (UID: \"236d88b9-092a-448d-a793-b133f7abe5f9\") " pod="openstack/ovn-controller-metrics-8sm5d" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.516580 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236d88b9-092a-448d-a793-b133f7abe5f9-combined-ca-bundle\") pod \"ovn-controller-metrics-8sm5d\" (UID: \"236d88b9-092a-448d-a793-b133f7abe5f9\") " pod="openstack/ovn-controller-metrics-8sm5d" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.516616 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7drrv\" (UniqueName: \"kubernetes.io/projected/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-kube-api-access-7drrv\") pod \"dnsmasq-dns-fb4b9554f-tswhd\" (UID: \"ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e\") " pod="openstack/dnsmasq-dns-fb4b9554f-tswhd" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.516634 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-ovsdbserver-nb\") pod \"dnsmasq-dns-fb4b9554f-tswhd\" (UID: \"ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e\") " pod="openstack/dnsmasq-dns-fb4b9554f-tswhd" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.517460 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-ovsdbserver-nb\") pod \"dnsmasq-dns-fb4b9554f-tswhd\" (UID: \"ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e\") " pod="openstack/dnsmasq-dns-fb4b9554f-tswhd" Oct 01 13:11:00 crc kubenswrapper[4851]: E1001 13:11:00.517702 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-7drrv ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-fb4b9554f-tswhd" podUID="ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.518422 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-dns-svc\") pod \"dnsmasq-dns-fb4b9554f-tswhd\" (UID: \"ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e\") " pod="openstack/dnsmasq-dns-fb4b9554f-tswhd" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.518816 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-config\") pod \"dnsmasq-dns-fb4b9554f-tswhd\" (UID: \"ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e\") " pod="openstack/dnsmasq-dns-fb4b9554f-tswhd" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.530448 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.532037 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.536637 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.536784 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.537221 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-t5zmp" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.537424 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.544374 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7drrv\" (UniqueName: \"kubernetes.io/projected/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-kube-api-access-7drrv\") pod \"dnsmasq-dns-fb4b9554f-tswhd\" (UID: \"ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e\") " pod="openstack/dnsmasq-dns-fb4b9554f-tswhd" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.554268 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.571031 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89495b55c-rrgzf"] Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.578984 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89495b55c-rrgzf" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.583874 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.609266 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89495b55c-rrgzf"] Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.618534 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/236d88b9-092a-448d-a793-b133f7abe5f9-ovs-rundir\") pod \"ovn-controller-metrics-8sm5d\" (UID: \"236d88b9-092a-448d-a793-b133f7abe5f9\") " pod="openstack/ovn-controller-metrics-8sm5d" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.618581 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/236d88b9-092a-448d-a793-b133f7abe5f9-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8sm5d\" (UID: \"236d88b9-092a-448d-a793-b133f7abe5f9\") " pod="openstack/ovn-controller-metrics-8sm5d" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.618611 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/59b6f03e-998a-46c8-a0d7-ce0cbad79b3b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b\") " pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.618646 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/236d88b9-092a-448d-a793-b133f7abe5f9-ovn-rundir\") pod \"ovn-controller-metrics-8sm5d\" (UID: \"236d88b9-092a-448d-a793-b133f7abe5f9\") " pod="openstack/ovn-controller-metrics-8sm5d" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.618679 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6cgb\" (UniqueName: \"kubernetes.io/projected/59b6f03e-998a-46c8-a0d7-ce0cbad79b3b-kube-api-access-p6cgb\") pod \"ovn-northd-0\" (UID: \"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b\") " pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.618701 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236d88b9-092a-448d-a793-b133f7abe5f9-combined-ca-bundle\") pod \"ovn-controller-metrics-8sm5d\" (UID: \"236d88b9-092a-448d-a793-b133f7abe5f9\") " pod="openstack/ovn-controller-metrics-8sm5d" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.618757 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/59b6f03e-998a-46c8-a0d7-ce0cbad79b3b-scripts\") pod \"ovn-northd-0\" (UID: \"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b\") " pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.618779 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/59b6f03e-998a-46c8-a0d7-ce0cbad79b3b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b\") " pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.618816 4851 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b6f03e-998a-46c8-a0d7-ce0cbad79b3b-config\") pod \"ovn-northd-0\" (UID: \"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b\") " pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.618845 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4nvg\" (UniqueName: \"kubernetes.io/projected/236d88b9-092a-448d-a793-b133f7abe5f9-kube-api-access-l4nvg\") pod \"ovn-controller-metrics-8sm5d\" (UID: \"236d88b9-092a-448d-a793-b133f7abe5f9\") " pod="openstack/ovn-controller-metrics-8sm5d" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.618876 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/59b6f03e-998a-46c8-a0d7-ce0cbad79b3b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b\") " pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.618906 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b6f03e-998a-46c8-a0d7-ce0cbad79b3b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b\") " pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.618940 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/236d88b9-092a-448d-a793-b133f7abe5f9-config\") pod \"ovn-controller-metrics-8sm5d\" (UID: \"236d88b9-092a-448d-a793-b133f7abe5f9\") " pod="openstack/ovn-controller-metrics-8sm5d" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.619027 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/236d88b9-092a-448d-a793-b133f7abe5f9-ovs-rundir\") pod \"ovn-controller-metrics-8sm5d\" (UID: \"236d88b9-092a-448d-a793-b133f7abe5f9\") " pod="openstack/ovn-controller-metrics-8sm5d" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.619108 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/236d88b9-092a-448d-a793-b133f7abe5f9-ovn-rundir\") pod \"ovn-controller-metrics-8sm5d\" (UID: \"236d88b9-092a-448d-a793-b133f7abe5f9\") " pod="openstack/ovn-controller-metrics-8sm5d" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.619741 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/236d88b9-092a-448d-a793-b133f7abe5f9-config\") pod \"ovn-controller-metrics-8sm5d\" (UID: \"236d88b9-092a-448d-a793-b133f7abe5f9\") " pod="openstack/ovn-controller-metrics-8sm5d" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.622177 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/236d88b9-092a-448d-a793-b133f7abe5f9-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8sm5d\" (UID: \"236d88b9-092a-448d-a793-b133f7abe5f9\") " pod="openstack/ovn-controller-metrics-8sm5d" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.638017 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/236d88b9-092a-448d-a793-b133f7abe5f9-combined-ca-bundle\") pod \"ovn-controller-metrics-8sm5d\" (UID: \"236d88b9-092a-448d-a793-b133f7abe5f9\") " pod="openstack/ovn-controller-metrics-8sm5d" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.641069 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4nvg\" (UniqueName: \"kubernetes.io/projected/236d88b9-092a-448d-a793-b133f7abe5f9-kube-api-access-l4nvg\") pod \"ovn-controller-metrics-8sm5d\" (UID: \"236d88b9-092a-448d-a793-b133f7abe5f9\") " pod="openstack/ovn-controller-metrics-8sm5d" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.702178 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8sm5d" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.721856 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/59b6f03e-998a-46c8-a0d7-ce0cbad79b3b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b\") " pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.721911 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6cgb\" (UniqueName: \"kubernetes.io/projected/59b6f03e-998a-46c8-a0d7-ce0cbad79b3b-kube-api-access-p6cgb\") pod \"ovn-northd-0\" (UID: \"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b\") " pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.721957 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/59b6f03e-998a-46c8-a0d7-ce0cbad79b3b-scripts\") pod \"ovn-northd-0\" (UID: \"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b\") " pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.721978 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/59b6f03e-998a-46c8-a0d7-ce0cbad79b3b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b\") " pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.722002 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-dns-svc\") pod \"dnsmasq-dns-89495b55c-rrgzf\" (UID: \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\") " pod="openstack/dnsmasq-dns-89495b55c-rrgzf" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.722030 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b6f03e-998a-46c8-a0d7-ce0cbad79b3b-config\") pod \"ovn-northd-0\" (UID: \"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b\") " pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.722061 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/59b6f03e-998a-46c8-a0d7-ce0cbad79b3b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b\") " pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.722086 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/59b6f03e-998a-46c8-a0d7-ce0cbad79b3b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b\") " pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.722112 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-ovsdbserver-sb\") pod \"dnsmasq-dns-89495b55c-rrgzf\" (UID: \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\") " pod="openstack/dnsmasq-dns-89495b55c-rrgzf" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.722151 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-config\") pod \"dnsmasq-dns-89495b55c-rrgzf\" (UID: \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\") " pod="openstack/dnsmasq-dns-89495b55c-rrgzf" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.722166 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-ovsdbserver-nb\") pod \"dnsmasq-dns-89495b55c-rrgzf\" (UID: \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\") " pod="openstack/dnsmasq-dns-89495b55c-rrgzf" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.722183 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z5f8\" (UniqueName: \"kubernetes.io/projected/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-kube-api-access-8z5f8\") pod \"dnsmasq-dns-89495b55c-rrgzf\" (UID: \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\") " pod="openstack/dnsmasq-dns-89495b55c-rrgzf" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.723136 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/59b6f03e-998a-46c8-a0d7-ce0cbad79b3b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b\") " pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.723410 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/59b6f03e-998a-46c8-a0d7-ce0cbad79b3b-scripts\") pod \"ovn-northd-0\" (UID: \"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b\") " pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.724849 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b6f03e-998a-46c8-a0d7-ce0cbad79b3b-config\") pod \"ovn-northd-0\" (UID: \"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b\") " pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.725682 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/59b6f03e-998a-46c8-a0d7-ce0cbad79b3b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b\") " pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.726221 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/59b6f03e-998a-46c8-a0d7-ce0cbad79b3b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b\") " pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc 
kubenswrapper[4851]: I1001 13:11:00.739076 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6cgb\" (UniqueName: \"kubernetes.io/projected/59b6f03e-998a-46c8-a0d7-ce0cbad79b3b-kube-api-access-p6cgb\") pod \"ovn-northd-0\" (UID: \"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b\") " pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.740629 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b6f03e-998a-46c8-a0d7-ce0cbad79b3b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b\") " pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.823767 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-dns-svc\") pod \"dnsmasq-dns-89495b55c-rrgzf\" (UID: \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\") " pod="openstack/dnsmasq-dns-89495b55c-rrgzf" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.823869 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-ovsdbserver-sb\") pod \"dnsmasq-dns-89495b55c-rrgzf\" (UID: \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\") " pod="openstack/dnsmasq-dns-89495b55c-rrgzf" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.823898 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-config\") pod \"dnsmasq-dns-89495b55c-rrgzf\" (UID: \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\") " pod="openstack/dnsmasq-dns-89495b55c-rrgzf" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.823925 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-ovsdbserver-nb\") pod \"dnsmasq-dns-89495b55c-rrgzf\" (UID: \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\") " pod="openstack/dnsmasq-dns-89495b55c-rrgzf" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.823950 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z5f8\" (UniqueName: \"kubernetes.io/projected/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-kube-api-access-8z5f8\") pod \"dnsmasq-dns-89495b55c-rrgzf\" (UID: \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\") " pod="openstack/dnsmasq-dns-89495b55c-rrgzf" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.824594 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-dns-svc\") pod \"dnsmasq-dns-89495b55c-rrgzf\" (UID: \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\") " pod="openstack/dnsmasq-dns-89495b55c-rrgzf" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.825205 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-ovsdbserver-sb\") pod \"dnsmasq-dns-89495b55c-rrgzf\" (UID: \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\") " pod="openstack/dnsmasq-dns-89495b55c-rrgzf" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.825793 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-ovsdbserver-nb\") pod \"dnsmasq-dns-89495b55c-rrgzf\" (UID: \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\") " pod="openstack/dnsmasq-dns-89495b55c-rrgzf" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.827149 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-config\") pod \"dnsmasq-dns-89495b55c-rrgzf\" (UID: \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\") " pod="openstack/dnsmasq-dns-89495b55c-rrgzf" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.844358 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z5f8\" (UniqueName: \"kubernetes.io/projected/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-kube-api-access-8z5f8\") pod \"dnsmasq-dns-89495b55c-rrgzf\" (UID: \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\") " pod="openstack/dnsmasq-dns-89495b55c-rrgzf" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.906670 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.918037 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89495b55c-rrgzf" Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.971720 4851 generic.go:334] "Generic (PLEG): container finished" podID="486b3f7c-3593-4950-b470-2d0a2f037e2f" containerID="e23df47d3fab5210b11bee1ab4f89352721626bac74ccd2d9dd929430f568032" exitCode=0 Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.972638 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"486b3f7c-3593-4950-b470-2d0a2f037e2f","Type":"ContainerDied","Data":"e23df47d3fab5210b11bee1ab4f89352721626bac74ccd2d9dd929430f568032"} Oct 01 13:11:00 crc kubenswrapper[4851]: I1001 13:11:00.972951 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb4b9554f-tswhd" Oct 01 13:11:01 crc kubenswrapper[4851]: I1001 13:11:01.037186 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fb4b9554f-tswhd" Oct 01 13:11:01 crc kubenswrapper[4851]: I1001 13:11:01.128955 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-dns-svc\") pod \"ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e\" (UID: \"ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e\") " Oct 01 13:11:01 crc kubenswrapper[4851]: I1001 13:11:01.129000 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-config\") pod \"ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e\" (UID: \"ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e\") " Oct 01 13:11:01 crc kubenswrapper[4851]: I1001 13:11:01.129212 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-ovsdbserver-nb\") pod \"ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e\" (UID: \"ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e\") " Oct 01 13:11:01 crc kubenswrapper[4851]: I1001 13:11:01.129267 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7drrv\" (UniqueName: \"kubernetes.io/projected/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-kube-api-access-7drrv\") pod \"ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e\" (UID: \"ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e\") " Oct 01 13:11:01 crc kubenswrapper[4851]: I1001 13:11:01.129362 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e" (UID: "ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:11:01 crc kubenswrapper[4851]: I1001 13:11:01.129581 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-config" (OuterVolumeSpecName: "config") pod "ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e" (UID: "ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:11:01 crc kubenswrapper[4851]: I1001 13:11:01.130021 4851 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:01 crc kubenswrapper[4851]: I1001 13:11:01.130046 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:01 crc kubenswrapper[4851]: I1001 13:11:01.130723 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e" (UID: "ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:11:01 crc kubenswrapper[4851]: I1001 13:11:01.133958 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-kube-api-access-7drrv" (OuterVolumeSpecName: "kube-api-access-7drrv") pod "ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e" (UID: "ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e"). InnerVolumeSpecName "kube-api-access-7drrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:11:01 crc kubenswrapper[4851]: I1001 13:11:01.169263 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8sm5d"] Oct 01 13:11:01 crc kubenswrapper[4851]: W1001 13:11:01.180422 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod236d88b9_092a_448d_a793_b133f7abe5f9.slice/crio-27c36ab93dae95259ccdd4bcf95f3c92798e9b6ca55d68e251bf60dc6bcb650c WatchSource:0}: Error finding container 27c36ab93dae95259ccdd4bcf95f3c92798e9b6ca55d68e251bf60dc6bcb650c: Status 404 returned error can't find the container with id 27c36ab93dae95259ccdd4bcf95f3c92798e9b6ca55d68e251bf60dc6bcb650c Oct 01 13:11:01 crc kubenswrapper[4851]: I1001 13:11:01.231269 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:01 crc kubenswrapper[4851]: I1001 13:11:01.231556 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7drrv\" (UniqueName: \"kubernetes.io/projected/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e-kube-api-access-7drrv\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:01 crc kubenswrapper[4851]: I1001 13:11:01.436751 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 01 13:11:01 crc kubenswrapper[4851]: I1001 13:11:01.453804 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89495b55c-rrgzf"] Oct 01 13:11:01 crc kubenswrapper[4851]: W1001 13:11:01.456341 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1cb0c4a_b8d1_497e_ab96_79da3dc9b255.slice/crio-c9d3b01449270a87b3ea274c86f62119acbe8fedc2ab476383f819369cc2c21d WatchSource:0}: Error finding container c9d3b01449270a87b3ea274c86f62119acbe8fedc2ab476383f819369cc2c21d: Status 404 returned error can't find the container with id c9d3b01449270a87b3ea274c86f62119acbe8fedc2ab476383f819369cc2c21d Oct 01 13:11:01 crc kubenswrapper[4851]: W1001 13:11:01.457188 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59b6f03e_998a_46c8_a0d7_ce0cbad79b3b.slice/crio-db4196a0e145a5b09ff0ee2e81f738aa0b41c9294c474d6e120f50ae10a57ddb WatchSource:0}: Error finding container db4196a0e145a5b09ff0ee2e81f738aa0b41c9294c474d6e120f50ae10a57ddb: Status 404 returned error can't find the container with id db4196a0e145a5b09ff0ee2e81f738aa0b41c9294c474d6e120f50ae10a57ddb Oct 01 13:11:01 crc kubenswrapper[4851]: I1001 13:11:01.862494 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89495b55c-rrgzf"] Oct 01 13:11:01 crc kubenswrapper[4851]: I1001 13:11:01.864809 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 01 13:11:01 crc kubenswrapper[4851]: I1001 
13:11:01.958304 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c66b849c-m2rtt"] Oct 01 13:11:01 crc kubenswrapper[4851]: I1001 13:11:01.959687 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" Oct 01 13:11:01 crc kubenswrapper[4851]: I1001 13:11:01.993397 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c66b849c-m2rtt"] Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.012749 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b","Type":"ContainerStarted","Data":"db4196a0e145a5b09ff0ee2e81f738aa0b41c9294c474d6e120f50ae10a57ddb"} Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.050532 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-dns-svc\") pod \"dnsmasq-dns-5c66b849c-m2rtt\" (UID: \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\") " pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.050637 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfs9f\" (UniqueName: \"kubernetes.io/projected/9b0bcde2-7523-41f2-bf95-17f4be1e55de-kube-api-access-tfs9f\") pod \"dnsmasq-dns-5c66b849c-m2rtt\" (UID: \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\") " pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.050673 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-ovsdbserver-sb\") pod \"dnsmasq-dns-5c66b849c-m2rtt\" (UID: \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\") " pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.050691 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-ovsdbserver-nb\") pod \"dnsmasq-dns-5c66b849c-m2rtt\" (UID: \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\") " pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.050730 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-config\") pod \"dnsmasq-dns-5c66b849c-m2rtt\" (UID: \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\") " pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.055834 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8sm5d" event={"ID":"236d88b9-092a-448d-a793-b133f7abe5f9","Type":"ContainerStarted","Data":"e498da7a91f4167a26570ddd5c069cef4f1a783bba89f221dbc602c12ecdeb08"} Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.055869 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8sm5d" event={"ID":"236d88b9-092a-448d-a793-b133f7abe5f9","Type":"ContainerStarted","Data":"27c36ab93dae95259ccdd4bcf95f3c92798e9b6ca55d68e251bf60dc6bcb650c"} Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.079334 4851 generic.go:334] "Generic (PLEG): container finished" 
podID="d1cb0c4a-b8d1-497e-ab96-79da3dc9b255" containerID="c76d592329eaca506b8549152f8dfbedf0f4f534de5a45414514c6db4a2663de" exitCode=0 Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.079427 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb4b9554f-tswhd" Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.079426 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89495b55c-rrgzf" event={"ID":"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255","Type":"ContainerDied","Data":"c76d592329eaca506b8549152f8dfbedf0f4f534de5a45414514c6db4a2663de"} Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.079470 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89495b55c-rrgzf" event={"ID":"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255","Type":"ContainerStarted","Data":"c9d3b01449270a87b3ea274c86f62119acbe8fedc2ab476383f819369cc2c21d"} Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.093189 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-8sm5d" podStartSLOduration=2.093169326 podStartE2EDuration="2.093169326s" podCreationTimestamp="2025-10-01 13:11:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:11:02.079402695 +0000 UTC m=+1070.424520181" watchObservedRunningTime="2025-10-01 13:11:02.093169326 +0000 UTC m=+1070.438286812" Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.152343 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-config\") pod \"dnsmasq-dns-5c66b849c-m2rtt\" (UID: \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\") " pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.152452 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-dns-svc\") pod \"dnsmasq-dns-5c66b849c-m2rtt\" (UID: \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\") " pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.152604 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfs9f\" (UniqueName: \"kubernetes.io/projected/9b0bcde2-7523-41f2-bf95-17f4be1e55de-kube-api-access-tfs9f\") pod \"dnsmasq-dns-5c66b849c-m2rtt\" (UID: \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\") " pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.152638 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-ovsdbserver-sb\") pod \"dnsmasq-dns-5c66b849c-m2rtt\" (UID: \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\") " pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.152664 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-ovsdbserver-nb\") pod \"dnsmasq-dns-5c66b849c-m2rtt\" (UID: \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\") " pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.162132 4851 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-config\") pod \"dnsmasq-dns-5c66b849c-m2rtt\" (UID: \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\") " pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.162952 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-dns-svc\") pod \"dnsmasq-dns-5c66b849c-m2rtt\" (UID: \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\") " pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.164122 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-ovsdbserver-nb\") pod \"dnsmasq-dns-5c66b849c-m2rtt\" (UID: \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\") " pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.202414 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfs9f\" (UniqueName: \"kubernetes.io/projected/9b0bcde2-7523-41f2-bf95-17f4be1e55de-kube-api-access-tfs9f\") pod \"dnsmasq-dns-5c66b849c-m2rtt\" (UID: \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\") " pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.204878 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-ovsdbserver-sb\") pod \"dnsmasq-dns-5c66b849c-m2rtt\" (UID: \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\") " pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.311562 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fb4b9554f-tswhd"] Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.318963 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fb4b9554f-tswhd"] Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.319662 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.358438 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e" path="/var/lib/kubelet/pods/ee4b8fa9-1e09-47ed-b6b4-01a408ffb10e/volumes" Oct 01 13:11:02 crc kubenswrapper[4851]: I1001 13:11:02.870544 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c66b849c-m2rtt"] Oct 01 13:11:02 crc kubenswrapper[4851]: W1001 13:11:02.885439 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b0bcde2_7523_41f2_bf95_17f4be1e55de.slice/crio-51b11d27b64c378964b9a4634c28df627d4f09a6e84c624a99ad92e4c4070fbd WatchSource:0}: Error finding container 51b11d27b64c378964b9a4634c28df627d4f09a6e84c624a99ad92e4c4070fbd: Status 404 returned error can't find the container with id 51b11d27b64c378964b9a4634c28df627d4f09a6e84c624a99ad92e4c4070fbd Oct 01 13:11:02 crc kubenswrapper[4851]: E1001 13:11:02.896127 4851 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 01 13:11:02 crc kubenswrapper[4851]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 01 13:11:02 crc kubenswrapper[4851]: > podSandboxID="c9d3b01449270a87b3ea274c86f62119acbe8fedc2ab476383f819369cc2c21d" Oct 01 13:11:02 crc kubenswrapper[4851]: E1001 13:11:02.896297 4851 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 01 13:11:02 crc kubenswrapper[4851]: container &Container{Name:dnsmasq-dns,Image:38.102.83.36:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5dh565h686h689hc6h576h695h5b9h67bhf5h59dh595h55fh6ch548h547h688h655h59fh578h57bh594h568hb6h694h5c9hf4h549h6ch657h5b6h675q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8z5f8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-89495b55c-rrgzf_openstack(d1cb0c4a-b8d1-497e-ab96-79da3dc9b255): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 01 13:11:02 crc kubenswrapper[4851]: > logger="UnhandledError" Oct 01 13:11:02 crc kubenswrapper[4851]: E1001 13:11:02.897461 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-89495b55c-rrgzf" podUID="d1cb0c4a-b8d1-497e-ab96-79da3dc9b255" Oct 01 13:11:02 crc kubenswrapper[4851]: E1001 13:11:02.962146 4851 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 
38.102.83.251:46326->38.102.83.251:42733: write tcp 38.102.83.251:46326->38.102.83.251:42733: write: broken pipe Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.088330 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" event={"ID":"9b0bcde2-7523-41f2-bf95-17f4be1e55de","Type":"ContainerStarted","Data":"51b11d27b64c378964b9a4634c28df627d4f09a6e84c624a99ad92e4c4070fbd"} Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.126483 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.139359 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.144736 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-p46t9" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.144783 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.145080 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.146100 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.149858 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.279073 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-etc-swift\") pod \"swift-storage-0\" (UID: \"6a4d7762-af01-4c7e-9641-4b2054a8885d\") " pod="openstack/swift-storage-0" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.279364 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r655f\" (UniqueName: \"kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-kube-api-access-r655f\") pod \"swift-storage-0\" (UID: \"6a4d7762-af01-4c7e-9641-4b2054a8885d\") " pod="openstack/swift-storage-0" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.279398 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6a4d7762-af01-4c7e-9641-4b2054a8885d-cache\") pod \"swift-storage-0\" (UID: \"6a4d7762-af01-4c7e-9641-4b2054a8885d\") " pod="openstack/swift-storage-0" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.279430 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6a4d7762-af01-4c7e-9641-4b2054a8885d-lock\") pod \"swift-storage-0\" (UID: \"6a4d7762-af01-4c7e-9641-4b2054a8885d\") " pod="openstack/swift-storage-0" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.279473 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"6a4d7762-af01-4c7e-9641-4b2054a8885d\") " pod="openstack/swift-storage-0" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.380631 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-r655f\" (UniqueName: \"kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-kube-api-access-r655f\") pod \"swift-storage-0\" (UID: \"6a4d7762-af01-4c7e-9641-4b2054a8885d\") " pod="openstack/swift-storage-0" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.380684 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6a4d7762-af01-4c7e-9641-4b2054a8885d-cache\") pod \"swift-storage-0\" (UID: \"6a4d7762-af01-4c7e-9641-4b2054a8885d\") " pod="openstack/swift-storage-0" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.380711 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6a4d7762-af01-4c7e-9641-4b2054a8885d-lock\") pod \"swift-storage-0\" (UID: \"6a4d7762-af01-4c7e-9641-4b2054a8885d\") " pod="openstack/swift-storage-0" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.380743 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"6a4d7762-af01-4c7e-9641-4b2054a8885d\") " pod="openstack/swift-storage-0" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.380774 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-etc-swift\") pod \"swift-storage-0\" (UID: \"6a4d7762-af01-4c7e-9641-4b2054a8885d\") " pod="openstack/swift-storage-0" Oct 01 13:11:03 crc kubenswrapper[4851]: E1001 13:11:03.380920 4851 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 13:11:03 crc kubenswrapper[4851]: E1001 13:11:03.380938 4851 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 13:11:03 crc kubenswrapper[4851]: E1001 13:11:03.380983 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-etc-swift podName:6a4d7762-af01-4c7e-9641-4b2054a8885d nodeName:}" failed. No retries permitted until 2025-10-01 13:11:03.880966149 +0000 UTC m=+1072.226083635 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-etc-swift") pod "swift-storage-0" (UID: "6a4d7762-af01-4c7e-9641-4b2054a8885d") : configmap "swift-ring-files" not found Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.381471 4851 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"6a4d7762-af01-4c7e-9641-4b2054a8885d\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.381578 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6a4d7762-af01-4c7e-9641-4b2054a8885d-cache\") pod \"swift-storage-0\" (UID: \"6a4d7762-af01-4c7e-9641-4b2054a8885d\") " pod="openstack/swift-storage-0" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.381585 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6a4d7762-af01-4c7e-9641-4b2054a8885d-lock\") pod \"swift-storage-0\" (UID: \"6a4d7762-af01-4c7e-9641-4b2054a8885d\") " pod="openstack/swift-storage-0" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.382192 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89495b55c-rrgzf" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.426611 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r655f\" (UniqueName: \"kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-kube-api-access-r655f\") pod \"swift-storage-0\" (UID: \"6a4d7762-af01-4c7e-9641-4b2054a8885d\") " pod="openstack/swift-storage-0" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.431876 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"6a4d7762-af01-4c7e-9641-4b2054a8885d\") " pod="openstack/swift-storage-0" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.481234 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-ovsdbserver-sb\") pod \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\" (UID: \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\") " Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.481278 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-config\") pod \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\" (UID: \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\") " Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.481300 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-ovsdbserver-nb\") pod \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\" (UID: \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\") " Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.481384 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z5f8\" (UniqueName: \"kubernetes.io/projected/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-kube-api-access-8z5f8\") pod \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\" 
(UID: \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\") " Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.481465 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-dns-svc\") pod \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\" (UID: \"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255\") " Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.493716 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-kube-api-access-8z5f8" (OuterVolumeSpecName: "kube-api-access-8z5f8") pod "d1cb0c4a-b8d1-497e-ab96-79da3dc9b255" (UID: "d1cb0c4a-b8d1-497e-ab96-79da3dc9b255"). InnerVolumeSpecName "kube-api-access-8z5f8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.534393 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d1cb0c4a-b8d1-497e-ab96-79da3dc9b255" (UID: "d1cb0c4a-b8d1-497e-ab96-79da3dc9b255"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.541081 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d1cb0c4a-b8d1-497e-ab96-79da3dc9b255" (UID: "d1cb0c4a-b8d1-497e-ab96-79da3dc9b255"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.541365 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d1cb0c4a-b8d1-497e-ab96-79da3dc9b255" (UID: "d1cb0c4a-b8d1-497e-ab96-79da3dc9b255"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.545207 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-config" (OuterVolumeSpecName: "config") pod "d1cb0c4a-b8d1-497e-ab96-79da3dc9b255" (UID: "d1cb0c4a-b8d1-497e-ab96-79da3dc9b255"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.583582 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.583629 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.583638 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.583647 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z5f8\" (UniqueName: \"kubernetes.io/projected/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-kube-api-access-8z5f8\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.583659 4851 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.688035 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-s5dg8"] Oct 01 13:11:03 crc kubenswrapper[4851]: E1001 13:11:03.688369 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1cb0c4a-b8d1-497e-ab96-79da3dc9b255" containerName="init" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.688387 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1cb0c4a-b8d1-497e-ab96-79da3dc9b255" containerName="init" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.688600 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1cb0c4a-b8d1-497e-ab96-79da3dc9b255" containerName="init" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.689185 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.690969 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.691098 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.691460 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.699132 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-s5dg8"] Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.786839 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b0a8dd5b-b066-4203-b283-3ff979e8da98-swiftconf\") pod \"swift-ring-rebalance-s5dg8\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.786941 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0a8dd5b-b066-4203-b283-3ff979e8da98-scripts\") pod \"swift-ring-rebalance-s5dg8\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.786988 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b0a8dd5b-b066-4203-b283-3ff979e8da98-ring-data-devices\") pod \"swift-ring-rebalance-s5dg8\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.787068 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmw7x\" (UniqueName: \"kubernetes.io/projected/b0a8dd5b-b066-4203-b283-3ff979e8da98-kube-api-access-dmw7x\") pod \"swift-ring-rebalance-s5dg8\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.787111 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a8dd5b-b066-4203-b283-3ff979e8da98-combined-ca-bundle\") pod \"swift-ring-rebalance-s5dg8\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.787163 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b0a8dd5b-b066-4203-b283-3ff979e8da98-dispersionconf\") pod \"swift-ring-rebalance-s5dg8\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.787229 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b0a8dd5b-b066-4203-b283-3ff979e8da98-etc-swift\") pod \"swift-ring-rebalance-s5dg8\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 
13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.888618 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmw7x\" (UniqueName: \"kubernetes.io/projected/b0a8dd5b-b066-4203-b283-3ff979e8da98-kube-api-access-dmw7x\") pod \"swift-ring-rebalance-s5dg8\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.888905 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a8dd5b-b066-4203-b283-3ff979e8da98-combined-ca-bundle\") pod \"swift-ring-rebalance-s5dg8\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.888931 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b0a8dd5b-b066-4203-b283-3ff979e8da98-dispersionconf\") pod \"swift-ring-rebalance-s5dg8\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.888957 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b0a8dd5b-b066-4203-b283-3ff979e8da98-etc-swift\") pod \"swift-ring-rebalance-s5dg8\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.888996 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b0a8dd5b-b066-4203-b283-3ff979e8da98-swiftconf\") pod \"swift-ring-rebalance-s5dg8\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.889044 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0a8dd5b-b066-4203-b283-3ff979e8da98-scripts\") pod \"swift-ring-rebalance-s5dg8\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.889066 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b0a8dd5b-b066-4203-b283-3ff979e8da98-ring-data-devices\") pod \"swift-ring-rebalance-s5dg8\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.889099 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-etc-swift\") pod \"swift-storage-0\" (UID: \"6a4d7762-af01-4c7e-9641-4b2054a8885d\") " pod="openstack/swift-storage-0" Oct 01 13:11:03 crc kubenswrapper[4851]: E1001 13:11:03.889234 4851 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 13:11:03 crc kubenswrapper[4851]: E1001 13:11:03.889248 4851 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 13:11:03 crc kubenswrapper[4851]: E1001 13:11:03.889288 4851 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-etc-swift podName:6a4d7762-af01-4c7e-9641-4b2054a8885d nodeName:}" failed. No retries permitted until 2025-10-01 13:11:04.889274705 +0000 UTC m=+1073.234392191 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-etc-swift") pod "swift-storage-0" (UID: "6a4d7762-af01-4c7e-9641-4b2054a8885d") : configmap "swift-ring-files" not found Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.890322 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b0a8dd5b-b066-4203-b283-3ff979e8da98-etc-swift\") pod \"swift-ring-rebalance-s5dg8\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.890913 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b0a8dd5b-b066-4203-b283-3ff979e8da98-ring-data-devices\") pod \"swift-ring-rebalance-s5dg8\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.890952 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0a8dd5b-b066-4203-b283-3ff979e8da98-scripts\") pod \"swift-ring-rebalance-s5dg8\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.893626 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a8dd5b-b066-4203-b283-3ff979e8da98-combined-ca-bundle\") pod \"swift-ring-rebalance-s5dg8\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.893901 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b0a8dd5b-b066-4203-b283-3ff979e8da98-swiftconf\") pod \"swift-ring-rebalance-s5dg8\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.894196 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b0a8dd5b-b066-4203-b283-3ff979e8da98-dispersionconf\") pod \"swift-ring-rebalance-s5dg8\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:03 crc kubenswrapper[4851]: I1001 13:11:03.906097 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmw7x\" (UniqueName: \"kubernetes.io/projected/b0a8dd5b-b066-4203-b283-3ff979e8da98-kube-api-access-dmw7x\") pod \"swift-ring-rebalance-s5dg8\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:04 crc kubenswrapper[4851]: I1001 13:11:04.008240 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:04 crc kubenswrapper[4851]: I1001 13:11:04.110896 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b","Type":"ContainerStarted","Data":"a3d1e60300bb0e7956f63abe098eeb5d83de54e7c738b1183e76cb297b29fbb2"} Oct 01 13:11:04 crc kubenswrapper[4851]: I1001 13:11:04.110934 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"59b6f03e-998a-46c8-a0d7-ce0cbad79b3b","Type":"ContainerStarted","Data":"dfed54d6ed14a2e64e57b07041dec47976b4f5abf442e2ce4d15b4e25d419ff0"} Oct 01 13:11:04 crc kubenswrapper[4851]: I1001 13:11:04.111841 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 01 13:11:04 crc kubenswrapper[4851]: I1001 13:11:04.115728 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89495b55c-rrgzf" event={"ID":"d1cb0c4a-b8d1-497e-ab96-79da3dc9b255","Type":"ContainerDied","Data":"c9d3b01449270a87b3ea274c86f62119acbe8fedc2ab476383f819369cc2c21d"} Oct 01 13:11:04 crc kubenswrapper[4851]: I1001 13:11:04.115764 4851 scope.go:117] "RemoveContainer" containerID="c76d592329eaca506b8549152f8dfbedf0f4f534de5a45414514c6db4a2663de" Oct 01 13:11:04 crc kubenswrapper[4851]: I1001 13:11:04.115864 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89495b55c-rrgzf" Oct 01 13:11:04 crc kubenswrapper[4851]: I1001 13:11:04.120460 4851 generic.go:334] "Generic (PLEG): container finished" podID="9b0bcde2-7523-41f2-bf95-17f4be1e55de" containerID="29621214da02542afefea5057c85fdba14ff3c75adbd9b26abad60c55e5a212c" exitCode=0 Oct 01 13:11:04 crc kubenswrapper[4851]: I1001 13:11:04.122201 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" event={"ID":"9b0bcde2-7523-41f2-bf95-17f4be1e55de","Type":"ContainerDied","Data":"29621214da02542afefea5057c85fdba14ff3c75adbd9b26abad60c55e5a212c"} Oct 01 13:11:04 crc kubenswrapper[4851]: I1001 13:11:04.133173 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.6625596959999998 podStartE2EDuration="4.133157261s" podCreationTimestamp="2025-10-01 13:11:00 +0000 UTC" firstStartedPulling="2025-10-01 13:11:01.459599612 +0000 UTC m=+1069.804717098" lastFinishedPulling="2025-10-01 13:11:02.930197177 +0000 UTC m=+1071.275314663" observedRunningTime="2025-10-01 13:11:04.12887381 +0000 UTC m=+1072.473991306" watchObservedRunningTime="2025-10-01 13:11:04.133157261 +0000 UTC m=+1072.478274747" Oct 01 13:11:04 crc kubenswrapper[4851]: I1001 13:11:04.272425 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89495b55c-rrgzf"] Oct 01 13:11:04 crc kubenswrapper[4851]: I1001 13:11:04.276308 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89495b55c-rrgzf"] Oct 01 13:11:04 crc kubenswrapper[4851]: I1001 13:11:04.345893 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1cb0c4a-b8d1-497e-ab96-79da3dc9b255" path="/var/lib/kubelet/pods/d1cb0c4a-b8d1-497e-ab96-79da3dc9b255/volumes" Oct 01 13:11:04 crc kubenswrapper[4851]: I1001 13:11:04.515581 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-s5dg8"] Oct 01 13:11:04 crc kubenswrapper[4851]: W1001 13:11:04.521030 4851 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0a8dd5b_b066_4203_b283_3ff979e8da98.slice/crio-1a7009f5fd7480e79ce85bbf014b1ca650543e8efb16dc51093331793bacd242 WatchSource:0}: Error finding container 1a7009f5fd7480e79ce85bbf014b1ca650543e8efb16dc51093331793bacd242: Status 404 returned error can't find the container with id 1a7009f5fd7480e79ce85bbf014b1ca650543e8efb16dc51093331793bacd242 Oct 01 13:11:04 crc kubenswrapper[4851]: I1001 13:11:04.911017 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-etc-swift\") pod \"swift-storage-0\" (UID: \"6a4d7762-af01-4c7e-9641-4b2054a8885d\") " pod="openstack/swift-storage-0" Oct 01 13:11:04 crc kubenswrapper[4851]: E1001 13:11:04.911290 4851 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 13:11:04 crc kubenswrapper[4851]: E1001 13:11:04.911333 4851 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 13:11:04 crc kubenswrapper[4851]: E1001 13:11:04.911385 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-etc-swift podName:6a4d7762-af01-4c7e-9641-4b2054a8885d nodeName:}" failed. No retries permitted until 2025-10-01 13:11:06.911368491 +0000 UTC m=+1075.256485977 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-etc-swift") pod "swift-storage-0" (UID: "6a4d7762-af01-4c7e-9641-4b2054a8885d") : configmap "swift-ring-files" not found Oct 01 13:11:05 crc kubenswrapper[4851]: I1001 13:11:05.138482 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" event={"ID":"9b0bcde2-7523-41f2-bf95-17f4be1e55de","Type":"ContainerStarted","Data":"30e5d47cf1ecb5ce3426302c5a3b0bedb12f82a2d45145740bfbfaec971d8312"} Oct 01 13:11:05 crc kubenswrapper[4851]: I1001 13:11:05.139350 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" Oct 01 13:11:05 crc kubenswrapper[4851]: I1001 13:11:05.141548 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s5dg8" event={"ID":"b0a8dd5b-b066-4203-b283-3ff979e8da98","Type":"ContainerStarted","Data":"1a7009f5fd7480e79ce85bbf014b1ca650543e8efb16dc51093331793bacd242"} Oct 01 13:11:05 crc kubenswrapper[4851]: I1001 13:11:05.163446 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" podStartSLOduration=4.16342688 podStartE2EDuration="4.16342688s" podCreationTimestamp="2025-10-01 13:11:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:11:05.158408087 +0000 UTC m=+1073.503525573" watchObservedRunningTime="2025-10-01 13:11:05.16342688 +0000 UTC m=+1073.508544366" Oct 01 13:11:06 crc kubenswrapper[4851]: I1001 13:11:06.955843 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-etc-swift\") pod \"swift-storage-0\" (UID: \"6a4d7762-af01-4c7e-9641-4b2054a8885d\") " pod="openstack/swift-storage-0" Oct 01 13:11:06 crc kubenswrapper[4851]: E1001 
13:11:06.956136 4851 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 13:11:06 crc kubenswrapper[4851]: E1001 13:11:06.956349 4851 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 13:11:06 crc kubenswrapper[4851]: E1001 13:11:06.956419 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-etc-swift podName:6a4d7762-af01-4c7e-9641-4b2054a8885d nodeName:}" failed. No retries permitted until 2025-10-01 13:11:10.95639417 +0000 UTC m=+1079.301511696 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-etc-swift") pod "swift-storage-0" (UID: "6a4d7762-af01-4c7e-9641-4b2054a8885d") : configmap "swift-ring-files" not found Oct 01 13:11:07 crc kubenswrapper[4851]: I1001 13:11:07.580033 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 01 13:11:07 crc kubenswrapper[4851]: I1001 13:11:07.684766 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 01 13:11:09 crc kubenswrapper[4851]: I1001 13:11:09.442188 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 01 13:11:09 crc kubenswrapper[4851]: I1001 13:11:09.442742 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 01 13:11:09 crc kubenswrapper[4851]: I1001 13:11:09.504272 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 01 13:11:10 crc kubenswrapper[4851]: I1001 13:11:10.288054 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 01 13:11:11 crc kubenswrapper[4851]: I1001 13:11:11.036749 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-etc-swift\") pod \"swift-storage-0\" (UID: \"6a4d7762-af01-4c7e-9641-4b2054a8885d\") " pod="openstack/swift-storage-0" Oct 01 13:11:11 crc kubenswrapper[4851]: E1001 13:11:11.036916 4851 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 13:11:11 crc kubenswrapper[4851]: E1001 13:11:11.037190 4851 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 13:11:11 crc kubenswrapper[4851]: E1001 13:11:11.037240 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-etc-swift podName:6a4d7762-af01-4c7e-9641-4b2054a8885d nodeName:}" failed. No retries permitted until 2025-10-01 13:11:19.037221084 +0000 UTC m=+1087.382338570 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-etc-swift") pod "swift-storage-0" (UID: "6a4d7762-af01-4c7e-9641-4b2054a8885d") : configmap "swift-ring-files" not found Oct 01 13:11:11 crc kubenswrapper[4851]: I1001 13:11:11.949741 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-wp2hj"] Oct 01 13:11:11 crc kubenswrapper[4851]: I1001 13:11:11.950905 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-wp2hj" Oct 01 13:11:11 crc kubenswrapper[4851]: I1001 13:11:11.985465 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-wp2hj"] Oct 01 13:11:12 crc kubenswrapper[4851]: I1001 13:11:12.052865 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29qtc\" (UniqueName: \"kubernetes.io/projected/e6493c0f-dc39-4b1b-a93e-e8a4bf323729-kube-api-access-29qtc\") pod \"watcher-db-create-wp2hj\" (UID: \"e6493c0f-dc39-4b1b-a93e-e8a4bf323729\") " pod="openstack/watcher-db-create-wp2hj" Oct 01 13:11:12 crc kubenswrapper[4851]: I1001 13:11:12.154401 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29qtc\" (UniqueName: \"kubernetes.io/projected/e6493c0f-dc39-4b1b-a93e-e8a4bf323729-kube-api-access-29qtc\") pod \"watcher-db-create-wp2hj\" (UID: \"e6493c0f-dc39-4b1b-a93e-e8a4bf323729\") " pod="openstack/watcher-db-create-wp2hj" Oct 01 13:11:12 crc kubenswrapper[4851]: I1001 13:11:12.170963 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29qtc\" (UniqueName: \"kubernetes.io/projected/e6493c0f-dc39-4b1b-a93e-e8a4bf323729-kube-api-access-29qtc\") pod \"watcher-db-create-wp2hj\" (UID: \"e6493c0f-dc39-4b1b-a93e-e8a4bf323729\") " pod="openstack/watcher-db-create-wp2hj" Oct 01 13:11:12 crc kubenswrapper[4851]: I1001 13:11:12.214665 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s5dg8" event={"ID":"b0a8dd5b-b066-4203-b283-3ff979e8da98","Type":"ContainerStarted","Data":"3130af2c0e4ac409158ac7f1a657cb0a4d8b56465a2d28815a4fc934159e83e9"} Oct 01 13:11:12 crc kubenswrapper[4851]: I1001 13:11:12.216646 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"486b3f7c-3593-4950-b470-2d0a2f037e2f","Type":"ContainerStarted","Data":"5d3a504e2fc801c80023dc63dd412e85d75002d50155ac11fc8476348268197a"} Oct 01 13:11:12 crc kubenswrapper[4851]: I1001 13:11:12.269274 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-wp2hj" Oct 01 13:11:12 crc kubenswrapper[4851]: I1001 13:11:12.321693 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" Oct 01 13:11:12 crc kubenswrapper[4851]: I1001 13:11:12.421660 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56595f9cf7-29fbx"] Oct 01 13:11:12 crc kubenswrapper[4851]: I1001 13:11:12.421849 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56595f9cf7-29fbx" podUID="af2859ca-d690-49fb-8dd6-2b50441aa577" containerName="dnsmasq-dns" containerID="cri-o://d7715ee4abd20db674ef93a19e0cb5be83e147c9ecfc904a30f8f134ebe9bf01" gracePeriod=10 Oct 01 13:11:12 crc kubenswrapper[4851]: I1001 13:11:12.807573 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-wp2hj"] Oct 01 13:11:12 crc kubenswrapper[4851]: I1001 13:11:12.889717 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56595f9cf7-29fbx" Oct 01 13:11:12 crc kubenswrapper[4851]: I1001 13:11:12.972025 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n7vz\" (UniqueName: \"kubernetes.io/projected/af2859ca-d690-49fb-8dd6-2b50441aa577-kube-api-access-9n7vz\") pod \"af2859ca-d690-49fb-8dd6-2b50441aa577\" (UID: \"af2859ca-d690-49fb-8dd6-2b50441aa577\") " Oct 01 13:11:12 crc kubenswrapper[4851]: I1001 13:11:12.972172 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af2859ca-d690-49fb-8dd6-2b50441aa577-config\") pod \"af2859ca-d690-49fb-8dd6-2b50441aa577\" (UID: \"af2859ca-d690-49fb-8dd6-2b50441aa577\") " Oct 01 13:11:12 crc kubenswrapper[4851]: I1001 13:11:12.972273 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af2859ca-d690-49fb-8dd6-2b50441aa577-dns-svc\") pod \"af2859ca-d690-49fb-8dd6-2b50441aa577\" (UID: \"af2859ca-d690-49fb-8dd6-2b50441aa577\") " Oct 01 13:11:12 crc kubenswrapper[4851]: I1001 13:11:12.992726 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af2859ca-d690-49fb-8dd6-2b50441aa577-kube-api-access-9n7vz" (OuterVolumeSpecName: "kube-api-access-9n7vz") pod "af2859ca-d690-49fb-8dd6-2b50441aa577" (UID: "af2859ca-d690-49fb-8dd6-2b50441aa577"). InnerVolumeSpecName "kube-api-access-9n7vz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:11:13 crc kubenswrapper[4851]: I1001 13:11:13.011188 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af2859ca-d690-49fb-8dd6-2b50441aa577-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "af2859ca-d690-49fb-8dd6-2b50441aa577" (UID: "af2859ca-d690-49fb-8dd6-2b50441aa577"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:11:13 crc kubenswrapper[4851]: I1001 13:11:13.024989 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af2859ca-d690-49fb-8dd6-2b50441aa577-config" (OuterVolumeSpecName: "config") pod "af2859ca-d690-49fb-8dd6-2b50441aa577" (UID: "af2859ca-d690-49fb-8dd6-2b50441aa577"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:11:13 crc kubenswrapper[4851]: I1001 13:11:13.074726 4851 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af2859ca-d690-49fb-8dd6-2b50441aa577-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:13 crc kubenswrapper[4851]: I1001 13:11:13.074766 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n7vz\" (UniqueName: \"kubernetes.io/projected/af2859ca-d690-49fb-8dd6-2b50441aa577-kube-api-access-9n7vz\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:13 crc kubenswrapper[4851]: I1001 13:11:13.074782 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af2859ca-d690-49fb-8dd6-2b50441aa577-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:13 crc kubenswrapper[4851]: I1001 13:11:13.230416 4851 generic.go:334] "Generic (PLEG): container finished" podID="af2859ca-d690-49fb-8dd6-2b50441aa577" containerID="d7715ee4abd20db674ef93a19e0cb5be83e147c9ecfc904a30f8f134ebe9bf01" exitCode=0 Oct 01 13:11:13 crc kubenswrapper[4851]: I1001 13:11:13.230568 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56595f9cf7-29fbx" Oct 01 13:11:13 crc kubenswrapper[4851]: I1001 13:11:13.231217 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56595f9cf7-29fbx" event={"ID":"af2859ca-d690-49fb-8dd6-2b50441aa577","Type":"ContainerDied","Data":"d7715ee4abd20db674ef93a19e0cb5be83e147c9ecfc904a30f8f134ebe9bf01"} Oct 01 13:11:13 crc kubenswrapper[4851]: I1001 13:11:13.231293 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56595f9cf7-29fbx" event={"ID":"af2859ca-d690-49fb-8dd6-2b50441aa577","Type":"ContainerDied","Data":"6efc1c43f7a68d92c2133143c9638fb3c8d859955320e27052f7e1bd94529ca2"} Oct 01 13:11:13 crc kubenswrapper[4851]: I1001 13:11:13.231328 4851 scope.go:117] "RemoveContainer" containerID="d7715ee4abd20db674ef93a19e0cb5be83e147c9ecfc904a30f8f134ebe9bf01" Oct 01 13:11:13 crc kubenswrapper[4851]: I1001 13:11:13.235314 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-wp2hj" event={"ID":"e6493c0f-dc39-4b1b-a93e-e8a4bf323729","Type":"ContainerStarted","Data":"7e57ccb5ac69cee39e1b211bb737f3c602e5cdf9e62fc6623cceb2819dfceee1"} Oct 01 13:11:13 crc kubenswrapper[4851]: I1001 13:11:13.235371 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-wp2hj" event={"ID":"e6493c0f-dc39-4b1b-a93e-e8a4bf323729","Type":"ContainerStarted","Data":"eb0b71a790372905126afece5db11b8f83c9111c2c5bd6d54ab67d0d3efe3d28"} Oct 01 13:11:13 crc kubenswrapper[4851]: I1001 13:11:13.251846 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-s5dg8" podStartSLOduration=2.804586388 podStartE2EDuration="10.251820447s" podCreationTimestamp="2025-10-01 13:11:03 +0000 UTC" firstStartedPulling="2025-10-01 13:11:04.522818916 +0000 UTC m=+1072.867936402" lastFinishedPulling="2025-10-01 13:11:11.970052975 +0000 UTC m=+1080.315170461" observedRunningTime="2025-10-01 13:11:13.25087003 +0000 UTC m=+1081.595987546" watchObservedRunningTime="2025-10-01 13:11:13.251820447 +0000 UTC m=+1081.596937953" Oct 01 13:11:13 crc kubenswrapper[4851]: I1001 13:11:13.261685 4851 scope.go:117] "RemoveContainer" containerID="4ae053a22d1220c462d4dc1dc6d796cd3bdc4f6fa977347571ead9987a93889e" Oct 01 13:11:13 crc 
kubenswrapper[4851]: I1001 13:11:13.273827 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-create-wp2hj" podStartSLOduration=2.273807552 podStartE2EDuration="2.273807552s" podCreationTimestamp="2025-10-01 13:11:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:11:13.270080766 +0000 UTC m=+1081.615198292" watchObservedRunningTime="2025-10-01 13:11:13.273807552 +0000 UTC m=+1081.618925048" Oct 01 13:11:13 crc kubenswrapper[4851]: I1001 13:11:13.298086 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56595f9cf7-29fbx"] Oct 01 13:11:13 crc kubenswrapper[4851]: I1001 13:11:13.298135 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56595f9cf7-29fbx"] Oct 01 13:11:13 crc kubenswrapper[4851]: I1001 13:11:13.304862 4851 scope.go:117] "RemoveContainer" containerID="d7715ee4abd20db674ef93a19e0cb5be83e147c9ecfc904a30f8f134ebe9bf01" Oct 01 13:11:13 crc kubenswrapper[4851]: E1001 13:11:13.305465 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7715ee4abd20db674ef93a19e0cb5be83e147c9ecfc904a30f8f134ebe9bf01\": container with ID starting with d7715ee4abd20db674ef93a19e0cb5be83e147c9ecfc904a30f8f134ebe9bf01 not found: ID does not exist" containerID="d7715ee4abd20db674ef93a19e0cb5be83e147c9ecfc904a30f8f134ebe9bf01" Oct 01 13:11:13 crc kubenswrapper[4851]: I1001 13:11:13.305703 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7715ee4abd20db674ef93a19e0cb5be83e147c9ecfc904a30f8f134ebe9bf01"} err="failed to get container status \"d7715ee4abd20db674ef93a19e0cb5be83e147c9ecfc904a30f8f134ebe9bf01\": rpc error: code = NotFound desc = could not find container \"d7715ee4abd20db674ef93a19e0cb5be83e147c9ecfc904a30f8f134ebe9bf01\": container with ID starting with d7715ee4abd20db674ef93a19e0cb5be83e147c9ecfc904a30f8f134ebe9bf01 not found: ID does not exist" Oct 01 13:11:13 crc kubenswrapper[4851]: I1001 13:11:13.305740 4851 scope.go:117] "RemoveContainer" containerID="4ae053a22d1220c462d4dc1dc6d796cd3bdc4f6fa977347571ead9987a93889e" Oct 01 13:11:13 crc kubenswrapper[4851]: E1001 13:11:13.306208 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ae053a22d1220c462d4dc1dc6d796cd3bdc4f6fa977347571ead9987a93889e\": container with ID starting with 4ae053a22d1220c462d4dc1dc6d796cd3bdc4f6fa977347571ead9987a93889e not found: ID does not exist" containerID="4ae053a22d1220c462d4dc1dc6d796cd3bdc4f6fa977347571ead9987a93889e" Oct 01 13:11:13 crc kubenswrapper[4851]: I1001 13:11:13.306241 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae053a22d1220c462d4dc1dc6d796cd3bdc4f6fa977347571ead9987a93889e"} err="failed to get container status \"4ae053a22d1220c462d4dc1dc6d796cd3bdc4f6fa977347571ead9987a93889e\": rpc error: code = NotFound desc = could not find container \"4ae053a22d1220c462d4dc1dc6d796cd3bdc4f6fa977347571ead9987a93889e\": container with ID starting with 4ae053a22d1220c462d4dc1dc6d796cd3bdc4f6fa977347571ead9987a93889e not found: ID does not exist" Oct 01 13:11:14 crc kubenswrapper[4851]: I1001 13:11:14.247588 4851 generic.go:334] "Generic (PLEG): container finished" podID="e6493c0f-dc39-4b1b-a93e-e8a4bf323729" 
containerID="7e57ccb5ac69cee39e1b211bb737f3c602e5cdf9e62fc6623cceb2819dfceee1" exitCode=0 Oct 01 13:11:14 crc kubenswrapper[4851]: I1001 13:11:14.247815 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-wp2hj" event={"ID":"e6493c0f-dc39-4b1b-a93e-e8a4bf323729","Type":"ContainerDied","Data":"7e57ccb5ac69cee39e1b211bb737f3c602e5cdf9e62fc6623cceb2819dfceee1"} Oct 01 13:11:14 crc kubenswrapper[4851]: I1001 13:11:14.347275 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af2859ca-d690-49fb-8dd6-2b50441aa577" path="/var/lib/kubelet/pods/af2859ca-d690-49fb-8dd6-2b50441aa577/volumes" Oct 01 13:11:15 crc kubenswrapper[4851]: I1001 13:11:15.258213 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"486b3f7c-3593-4950-b470-2d0a2f037e2f","Type":"ContainerStarted","Data":"f05fb51d2a98fdac4d8fd593f16ceb782f33265a66248db8be1c3c06402958b1"} Oct 01 13:11:15 crc kubenswrapper[4851]: I1001 13:11:15.291411 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-swbm8"] Oct 01 13:11:15 crc kubenswrapper[4851]: E1001 13:11:15.291787 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af2859ca-d690-49fb-8dd6-2b50441aa577" containerName="init" Oct 01 13:11:15 crc kubenswrapper[4851]: I1001 13:11:15.291803 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="af2859ca-d690-49fb-8dd6-2b50441aa577" containerName="init" Oct 01 13:11:15 crc kubenswrapper[4851]: E1001 13:11:15.291844 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af2859ca-d690-49fb-8dd6-2b50441aa577" containerName="dnsmasq-dns" Oct 01 13:11:15 crc kubenswrapper[4851]: I1001 13:11:15.291850 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="af2859ca-d690-49fb-8dd6-2b50441aa577" containerName="dnsmasq-dns" Oct 01 13:11:15 crc kubenswrapper[4851]: I1001 13:11:15.292039 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="af2859ca-d690-49fb-8dd6-2b50441aa577" containerName="dnsmasq-dns" Oct 01 13:11:15 crc kubenswrapper[4851]: I1001 13:11:15.292590 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-swbm8" Oct 01 13:11:15 crc kubenswrapper[4851]: I1001 13:11:15.309357 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-swbm8"] Oct 01 13:11:15 crc kubenswrapper[4851]: I1001 13:11:15.320435 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh68x\" (UniqueName: \"kubernetes.io/projected/891a835a-a012-4d76-a61f-393fbef5692f-kube-api-access-wh68x\") pod \"glance-db-create-swbm8\" (UID: \"891a835a-a012-4d76-a61f-393fbef5692f\") " pod="openstack/glance-db-create-swbm8" Oct 01 13:11:15 crc kubenswrapper[4851]: I1001 13:11:15.424460 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh68x\" (UniqueName: \"kubernetes.io/projected/891a835a-a012-4d76-a61f-393fbef5692f-kube-api-access-wh68x\") pod \"glance-db-create-swbm8\" (UID: \"891a835a-a012-4d76-a61f-393fbef5692f\") " pod="openstack/glance-db-create-swbm8" Oct 01 13:11:15 crc kubenswrapper[4851]: I1001 13:11:15.452340 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh68x\" (UniqueName: \"kubernetes.io/projected/891a835a-a012-4d76-a61f-393fbef5692f-kube-api-access-wh68x\") pod \"glance-db-create-swbm8\" (UID: \"891a835a-a012-4d76-a61f-393fbef5692f\") " pod="openstack/glance-db-create-swbm8" Oct 01 13:11:15 crc kubenswrapper[4851]: I1001 13:11:15.654770 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-swbm8" Oct 01 13:11:15 crc kubenswrapper[4851]: I1001 13:11:15.788932 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-wp2hj" Oct 01 13:11:15 crc kubenswrapper[4851]: I1001 13:11:15.933059 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29qtc\" (UniqueName: \"kubernetes.io/projected/e6493c0f-dc39-4b1b-a93e-e8a4bf323729-kube-api-access-29qtc\") pod \"e6493c0f-dc39-4b1b-a93e-e8a4bf323729\" (UID: \"e6493c0f-dc39-4b1b-a93e-e8a4bf323729\") " Oct 01 13:11:15 crc kubenswrapper[4851]: I1001 13:11:15.942162 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6493c0f-dc39-4b1b-a93e-e8a4bf323729-kube-api-access-29qtc" (OuterVolumeSpecName: "kube-api-access-29qtc") pod "e6493c0f-dc39-4b1b-a93e-e8a4bf323729" (UID: "e6493c0f-dc39-4b1b-a93e-e8a4bf323729"). InnerVolumeSpecName "kube-api-access-29qtc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:11:15 crc kubenswrapper[4851]: I1001 13:11:15.988364 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 01 13:11:16 crc kubenswrapper[4851]: I1001 13:11:16.035534 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29qtc\" (UniqueName: \"kubernetes.io/projected/e6493c0f-dc39-4b1b-a93e-e8a4bf323729-kube-api-access-29qtc\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:16 crc kubenswrapper[4851]: I1001 13:11:16.126734 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-swbm8"] Oct 01 13:11:16 crc kubenswrapper[4851]: I1001 13:11:16.267818 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-wp2hj" event={"ID":"e6493c0f-dc39-4b1b-a93e-e8a4bf323729","Type":"ContainerDied","Data":"eb0b71a790372905126afece5db11b8f83c9111c2c5bd6d54ab67d0d3efe3d28"} Oct 01 13:11:16 crc kubenswrapper[4851]: I1001 13:11:16.268203 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb0b71a790372905126afece5db11b8f83c9111c2c5bd6d54ab67d0d3efe3d28" Oct 01 13:11:16 crc kubenswrapper[4851]: I1001 13:11:16.267839 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-wp2hj" Oct 01 13:11:16 crc kubenswrapper[4851]: I1001 13:11:16.269169 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-swbm8" event={"ID":"891a835a-a012-4d76-a61f-393fbef5692f","Type":"ContainerStarted","Data":"8122fbb1523ae572e85c84169d42b640930a38fc43a8210767bddf05f340a2a1"} Oct 01 13:11:17 crc kubenswrapper[4851]: I1001 13:11:17.280231 4851 generic.go:334] "Generic (PLEG): container finished" podID="891a835a-a012-4d76-a61f-393fbef5692f" containerID="486b3f2726dcbf09db1c1805cbe67352db45f6f7e4876ae95bd5ee9e936d63fc" exitCode=0 Oct 01 13:11:17 crc kubenswrapper[4851]: I1001 13:11:17.280347 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-swbm8" event={"ID":"891a835a-a012-4d76-a61f-393fbef5692f","Type":"ContainerDied","Data":"486b3f2726dcbf09db1c1805cbe67352db45f6f7e4876ae95bd5ee9e936d63fc"} Oct 01 13:11:19 crc kubenswrapper[4851]: I1001 13:11:19.103650 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-etc-swift\") pod \"swift-storage-0\" (UID: \"6a4d7762-af01-4c7e-9641-4b2054a8885d\") " pod="openstack/swift-storage-0" Oct 01 13:11:19 crc kubenswrapper[4851]: E1001 13:11:19.103857 4851 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 13:11:19 crc kubenswrapper[4851]: E1001 13:11:19.104091 4851 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 13:11:19 crc kubenswrapper[4851]: E1001 13:11:19.104148 4851 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-etc-swift podName:6a4d7762-af01-4c7e-9641-4b2054a8885d nodeName:}" failed. No retries permitted until 2025-10-01 13:11:35.10412901 +0000 UTC m=+1103.449246486 (durationBeforeRetry 16s). 
Oct 01 13:11:19 crc kubenswrapper[4851]: I1001 13:11:19.629155 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-swbm8"
Oct 01 13:11:19 crc kubenswrapper[4851]: I1001 13:11:19.694865 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-dkxdn"]
Oct 01 13:11:19 crc kubenswrapper[4851]: E1001 13:11:19.695388 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="891a835a-a012-4d76-a61f-393fbef5692f" containerName="mariadb-database-create"
Oct 01 13:11:19 crc kubenswrapper[4851]: I1001 13:11:19.695422 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="891a835a-a012-4d76-a61f-393fbef5692f" containerName="mariadb-database-create"
Oct 01 13:11:19 crc kubenswrapper[4851]: E1001 13:11:19.695439 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6493c0f-dc39-4b1b-a93e-e8a4bf323729" containerName="mariadb-database-create"
Oct 01 13:11:19 crc kubenswrapper[4851]: I1001 13:11:19.695451 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6493c0f-dc39-4b1b-a93e-e8a4bf323729" containerName="mariadb-database-create"
Oct 01 13:11:19 crc kubenswrapper[4851]: I1001 13:11:19.695761 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="891a835a-a012-4d76-a61f-393fbef5692f" containerName="mariadb-database-create"
Oct 01 13:11:19 crc kubenswrapper[4851]: I1001 13:11:19.695833 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6493c0f-dc39-4b1b-a93e-e8a4bf323729" containerName="mariadb-database-create"
Oct 01 13:11:19 crc kubenswrapper[4851]: I1001 13:11:19.696729 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dkxdn"
Oct 01 13:11:19 crc kubenswrapper[4851]: I1001 13:11:19.700824 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dkxdn"]
Oct 01 13:11:19 crc kubenswrapper[4851]: I1001 13:11:19.823570 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh68x\" (UniqueName: \"kubernetes.io/projected/891a835a-a012-4d76-a61f-393fbef5692f-kube-api-access-wh68x\") pod \"891a835a-a012-4d76-a61f-393fbef5692f\" (UID: \"891a835a-a012-4d76-a61f-393fbef5692f\") "
Oct 01 13:11:19 crc kubenswrapper[4851]: I1001 13:11:19.823990 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vzhg\" (UniqueName: \"kubernetes.io/projected/ef457f3c-0b63-4180-a5a7-bbf92e55a195-kube-api-access-5vzhg\") pod \"keystone-db-create-dkxdn\" (UID: \"ef457f3c-0b63-4180-a5a7-bbf92e55a195\") " pod="openstack/keystone-db-create-dkxdn"
Oct 01 13:11:19 crc kubenswrapper[4851]: I1001 13:11:19.829857 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/891a835a-a012-4d76-a61f-393fbef5692f-kube-api-access-wh68x" (OuterVolumeSpecName: "kube-api-access-wh68x") pod "891a835a-a012-4d76-a61f-393fbef5692f" (UID: "891a835a-a012-4d76-a61f-393fbef5692f"). InnerVolumeSpecName "kube-api-access-wh68x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:11:19 crc kubenswrapper[4851]: I1001 13:11:19.925443 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vzhg\" (UniqueName: \"kubernetes.io/projected/ef457f3c-0b63-4180-a5a7-bbf92e55a195-kube-api-access-5vzhg\") pod \"keystone-db-create-dkxdn\" (UID: \"ef457f3c-0b63-4180-a5a7-bbf92e55a195\") " pod="openstack/keystone-db-create-dkxdn"
Oct 01 13:11:19 crc kubenswrapper[4851]: I1001 13:11:19.925735 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh68x\" (UniqueName: \"kubernetes.io/projected/891a835a-a012-4d76-a61f-393fbef5692f-kube-api-access-wh68x\") on node \"crc\" DevicePath \"\""
Oct 01 13:11:19 crc kubenswrapper[4851]: I1001 13:11:19.952137 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vzhg\" (UniqueName: \"kubernetes.io/projected/ef457f3c-0b63-4180-a5a7-bbf92e55a195-kube-api-access-5vzhg\") pod \"keystone-db-create-dkxdn\" (UID: \"ef457f3c-0b63-4180-a5a7-bbf92e55a195\") " pod="openstack/keystone-db-create-dkxdn"
Oct 01 13:11:19 crc kubenswrapper[4851]: I1001 13:11:19.953131 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-qtxhk"]
Oct 01 13:11:19 crc kubenswrapper[4851]: I1001 13:11:19.955225 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-qtxhk"
Oct 01 13:11:19 crc kubenswrapper[4851]: I1001 13:11:19.963244 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-qtxhk"]
Oct 01 13:11:20 crc kubenswrapper[4851]: I1001 13:11:20.015208 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dkxdn"
Oct 01 13:11:20 crc kubenswrapper[4851]: I1001 13:11:20.128575 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzrvz\" (UniqueName: \"kubernetes.io/projected/079f3beb-593f-4866-ae54-8e1fb880c9b1-kube-api-access-bzrvz\") pod \"placement-db-create-qtxhk\" (UID: \"079f3beb-593f-4866-ae54-8e1fb880c9b1\") " pod="openstack/placement-db-create-qtxhk"
Oct 01 13:11:20 crc kubenswrapper[4851]: I1001 13:11:20.130311 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-dmq5k" podUID="7b15e5b7-162e-46ee-a292-bf763704cda6" containerName="ovn-controller" probeResult="failure" output=<
Oct 01 13:11:20 crc kubenswrapper[4851]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Oct 01 13:11:20 crc kubenswrapper[4851]: >
Oct 01 13:11:20 crc kubenswrapper[4851]: I1001 13:11:20.231202 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzrvz\" (UniqueName: \"kubernetes.io/projected/079f3beb-593f-4866-ae54-8e1fb880c9b1-kube-api-access-bzrvz\") pod \"placement-db-create-qtxhk\" (UID: \"079f3beb-593f-4866-ae54-8e1fb880c9b1\") " pod="openstack/placement-db-create-qtxhk"
Oct 01 13:11:20 crc kubenswrapper[4851]: I1001 13:11:20.254922 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzrvz\" (UniqueName: \"kubernetes.io/projected/079f3beb-593f-4866-ae54-8e1fb880c9b1-kube-api-access-bzrvz\") pod \"placement-db-create-qtxhk\" (UID: \"079f3beb-593f-4866-ae54-8e1fb880c9b1\") " pod="openstack/placement-db-create-qtxhk"
Oct 01 13:11:20 crc kubenswrapper[4851]: I1001 13:11:20.309014 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-swbm8" event={"ID":"891a835a-a012-4d76-a61f-393fbef5692f","Type":"ContainerDied","Data":"8122fbb1523ae572e85c84169d42b640930a38fc43a8210767bddf05f340a2a1"}
pod" pod="openstack/glance-db-create-swbm8" event={"ID":"891a835a-a012-4d76-a61f-393fbef5692f","Type":"ContainerDied","Data":"8122fbb1523ae572e85c84169d42b640930a38fc43a8210767bddf05f340a2a1"} Oct 01 13:11:20 crc kubenswrapper[4851]: I1001 13:11:20.309239 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8122fbb1523ae572e85c84169d42b640930a38fc43a8210767bddf05f340a2a1" Oct 01 13:11:20 crc kubenswrapper[4851]: I1001 13:11:20.309298 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-swbm8" Oct 01 13:11:20 crc kubenswrapper[4851]: I1001 13:11:20.398067 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-qtxhk" Oct 01 13:11:20 crc kubenswrapper[4851]: I1001 13:11:20.601187 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dkxdn"] Oct 01 13:11:20 crc kubenswrapper[4851]: W1001 13:11:20.692114 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef457f3c_0b63_4180_a5a7_bbf92e55a195.slice/crio-1c305fde4d9be3c6dc6d5d73ad6c9ed3bb96644df85b4ae92c0519147a4e2167 WatchSource:0}: Error finding container 1c305fde4d9be3c6dc6d5d73ad6c9ed3bb96644df85b4ae92c0519147a4e2167: Status 404 returned error can't find the container with id 1c305fde4d9be3c6dc6d5d73ad6c9ed3bb96644df85b4ae92c0519147a4e2167 Oct 01 13:11:20 crc kubenswrapper[4851]: I1001 13:11:20.865437 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-qtxhk"] Oct 01 13:11:21 crc kubenswrapper[4851]: I1001 13:11:21.320918 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dkxdn" event={"ID":"ef457f3c-0b63-4180-a5a7-bbf92e55a195","Type":"ContainerStarted","Data":"1c305fde4d9be3c6dc6d5d73ad6c9ed3bb96644df85b4ae92c0519147a4e2167"} Oct 01 13:11:21 crc kubenswrapper[4851]: I1001 13:11:21.322646 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-qtxhk" event={"ID":"079f3beb-593f-4866-ae54-8e1fb880c9b1","Type":"ContainerStarted","Data":"fdd62f94b8c12388727bc4aefadfc3a7eab1aaa6b22c346cabae80d2819c3f07"} Oct 01 13:11:21 crc kubenswrapper[4851]: I1001 13:11:21.987634 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-d2a4-account-create-gvkcm"] Oct 01 13:11:21 crc kubenswrapper[4851]: I1001 13:11:21.990067 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-d2a4-account-create-gvkcm" Oct 01 13:11:21 crc kubenswrapper[4851]: I1001 13:11:21.994820 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Oct 01 13:11:22 crc kubenswrapper[4851]: I1001 13:11:22.021115 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-d2a4-account-create-gvkcm"] Oct 01 13:11:22 crc kubenswrapper[4851]: I1001 13:11:22.081203 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tf9r\" (UniqueName: \"kubernetes.io/projected/6b8bb329-e08f-415d-a643-94cffe80471b-kube-api-access-5tf9r\") pod \"watcher-d2a4-account-create-gvkcm\" (UID: \"6b8bb329-e08f-415d-a643-94cffe80471b\") " pod="openstack/watcher-d2a4-account-create-gvkcm" Oct 01 13:11:22 crc kubenswrapper[4851]: I1001 13:11:22.183276 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tf9r\" (UniqueName: \"kubernetes.io/projected/6b8bb329-e08f-415d-a643-94cffe80471b-kube-api-access-5tf9r\") pod \"watcher-d2a4-account-create-gvkcm\" (UID: \"6b8bb329-e08f-415d-a643-94cffe80471b\") " pod="openstack/watcher-d2a4-account-create-gvkcm" Oct 01 13:11:22 crc kubenswrapper[4851]: I1001 13:11:22.213563 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tf9r\" (UniqueName: \"kubernetes.io/projected/6b8bb329-e08f-415d-a643-94cffe80471b-kube-api-access-5tf9r\") pod \"watcher-d2a4-account-create-gvkcm\" (UID: \"6b8bb329-e08f-415d-a643-94cffe80471b\") " pod="openstack/watcher-d2a4-account-create-gvkcm" Oct 01 13:11:22 crc kubenswrapper[4851]: I1001 13:11:22.321876 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-d2a4-account-create-gvkcm" Oct 01 13:11:22 crc kubenswrapper[4851]: I1001 13:11:22.344365 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-qtxhk" event={"ID":"079f3beb-593f-4866-ae54-8e1fb880c9b1","Type":"ContainerStarted","Data":"14d7041e98802f4e82c687617bf7cff5eaf16695c26dc2603dbc83300f24e6a0"} Oct 01 13:11:22 crc kubenswrapper[4851]: I1001 13:11:22.344404 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dkxdn" event={"ID":"ef457f3c-0b63-4180-a5a7-bbf92e55a195","Type":"ContainerStarted","Data":"0f567ca023d9a1094a3c2b1cea88bd2a96b1d3ae1079ddbe2b798507a43c05de"} Oct 01 13:11:22 crc kubenswrapper[4851]: I1001 13:11:22.424976 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-dkxdn" podStartSLOduration=3.42494812 podStartE2EDuration="3.42494812s" podCreationTimestamp="2025-10-01 13:11:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:11:22.396534873 +0000 UTC m=+1090.741652379" watchObservedRunningTime="2025-10-01 13:11:22.42494812 +0000 UTC m=+1090.770065636" Oct 01 13:11:22 crc kubenswrapper[4851]: I1001 13:11:22.454094 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-qtxhk" podStartSLOduration=3.4540672470000002 podStartE2EDuration="3.454067247s" podCreationTimestamp="2025-10-01 13:11:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:11:22.410820409 +0000 UTC m=+1090.755937925" 
watchObservedRunningTime="2025-10-01 13:11:22.454067247 +0000 UTC m=+1090.799184733" Oct 01 13:11:22 crc kubenswrapper[4851]: I1001 13:11:22.778672 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-d2a4-account-create-gvkcm"] Oct 01 13:11:22 crc kubenswrapper[4851]: W1001 13:11:22.827839 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b8bb329_e08f_415d_a643_94cffe80471b.slice/crio-12bfc14daf768b6707b56a570ee1814824af37572207b39e033a2a848c474948 WatchSource:0}: Error finding container 12bfc14daf768b6707b56a570ee1814824af37572207b39e033a2a848c474948: Status 404 returned error can't find the container with id 12bfc14daf768b6707b56a570ee1814824af37572207b39e033a2a848c474948 Oct 01 13:11:23 crc kubenswrapper[4851]: I1001 13:11:23.350062 4851 generic.go:334] "Generic (PLEG): container finished" podID="95c53639-696f-4d10-a297-7173dd3b394f" containerID="83ea666e82ffc032b51fb419fe2f3e2cbf43bf29cd27389317204ca318397904" exitCode=0 Oct 01 13:11:23 crc kubenswrapper[4851]: I1001 13:11:23.350187 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"95c53639-696f-4d10-a297-7173dd3b394f","Type":"ContainerDied","Data":"83ea666e82ffc032b51fb419fe2f3e2cbf43bf29cd27389317204ca318397904"} Oct 01 13:11:23 crc kubenswrapper[4851]: I1001 13:11:23.353437 4851 generic.go:334] "Generic (PLEG): container finished" podID="712c3704-a775-4fac-81d6-9aa9cfdc48ef" containerID="413d60ea21361b7065a174a88bcb5289e9879147b33e039ab6917e97367daef4" exitCode=0 Oct 01 13:11:23 crc kubenswrapper[4851]: I1001 13:11:23.353549 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"712c3704-a775-4fac-81d6-9aa9cfdc48ef","Type":"ContainerDied","Data":"413d60ea21361b7065a174a88bcb5289e9879147b33e039ab6917e97367daef4"} Oct 01 13:11:23 crc kubenswrapper[4851]: I1001 13:11:23.356405 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-d2a4-account-create-gvkcm" event={"ID":"6b8bb329-e08f-415d-a643-94cffe80471b","Type":"ContainerStarted","Data":"12bfc14daf768b6707b56a570ee1814824af37572207b39e033a2a848c474948"} Oct 01 13:11:23 crc kubenswrapper[4851]: I1001 13:11:23.358012 4851 generic.go:334] "Generic (PLEG): container finished" podID="8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb" containerID="614a0e65fe50b13ce4e61379c18ac00726f1b2b0fa03c62cd2708f05b65907e3" exitCode=0 Oct 01 13:11:23 crc kubenswrapper[4851]: I1001 13:11:23.358155 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb","Type":"ContainerDied","Data":"614a0e65fe50b13ce4e61379c18ac00726f1b2b0fa03c62cd2708f05b65907e3"} Oct 01 13:11:23 crc kubenswrapper[4851]: I1001 13:11:23.359584 4851 generic.go:334] "Generic (PLEG): container finished" podID="b0a8dd5b-b066-4203-b283-3ff979e8da98" containerID="3130af2c0e4ac409158ac7f1a657cb0a4d8b56465a2d28815a4fc934159e83e9" exitCode=0 Oct 01 13:11:23 crc kubenswrapper[4851]: I1001 13:11:23.359667 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s5dg8" event={"ID":"b0a8dd5b-b066-4203-b283-3ff979e8da98","Type":"ContainerDied","Data":"3130af2c0e4ac409158ac7f1a657cb0a4d8b56465a2d28815a4fc934159e83e9"} Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.370412 4851 generic.go:334] "Generic (PLEG): container finished" podID="6b8bb329-e08f-415d-a643-94cffe80471b" 
containerID="9f34893d354cd86fa202b84bf53091ae4b85961dfc88e0f14e5b5d191399abf3" exitCode=0 Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.370462 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-d2a4-account-create-gvkcm" event={"ID":"6b8bb329-e08f-415d-a643-94cffe80471b","Type":"ContainerDied","Data":"9f34893d354cd86fa202b84bf53091ae4b85961dfc88e0f14e5b5d191399abf3"} Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.373473 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb","Type":"ContainerStarted","Data":"c9b2276ee64cee82be501e84c7b2a652f70982084f83e2a8eacf7ecd4ca61b1e"} Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.373766 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.376715 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"95c53639-696f-4d10-a297-7173dd3b394f","Type":"ContainerStarted","Data":"484a61f0f7ac368499b822805fd5cb3b85d684289d2657f90c7f00476f9f1707"} Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.376943 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.379062 4851 generic.go:334] "Generic (PLEG): container finished" podID="079f3beb-593f-4866-ae54-8e1fb880c9b1" containerID="14d7041e98802f4e82c687617bf7cff5eaf16695c26dc2603dbc83300f24e6a0" exitCode=0 Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.379097 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-qtxhk" event={"ID":"079f3beb-593f-4866-ae54-8e1fb880c9b1","Type":"ContainerDied","Data":"14d7041e98802f4e82c687617bf7cff5eaf16695c26dc2603dbc83300f24e6a0"} Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.380957 4851 generic.go:334] "Generic (PLEG): container finished" podID="ef457f3c-0b63-4180-a5a7-bbf92e55a195" containerID="0f567ca023d9a1094a3c2b1cea88bd2a96b1d3ae1079ddbe2b798507a43c05de" exitCode=0 Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.380994 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dkxdn" event={"ID":"ef457f3c-0b63-4180-a5a7-bbf92e55a195","Type":"ContainerDied","Data":"0f567ca023d9a1094a3c2b1cea88bd2a96b1d3ae1079ddbe2b798507a43c05de"} Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.382981 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"712c3704-a775-4fac-81d6-9aa9cfdc48ef","Type":"ContainerStarted","Data":"ed5aa2aa3d0499ef8bdca55947cbc7697b854f3673f0b90ed8759cfa5950d638"} Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.383205 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.386010 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"486b3f7c-3593-4950-b470-2d0a2f037e2f","Type":"ContainerStarted","Data":"46100b329bd535edf0d5df7c0f0343220da44cc5bdb743cf098c51aa229456e0"} Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.435433 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-notifications-server-0" podStartSLOduration=-9223371966.419363 
podStartE2EDuration="1m10.435412477s" podCreationTimestamp="2025-10-01 13:10:14 +0000 UTC" firstStartedPulling="2025-10-01 13:10:31.351716989 +0000 UTC m=+1039.696834475" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:11:24.41828071 +0000 UTC m=+1092.763398226" watchObservedRunningTime="2025-10-01 13:11:24.435412477 +0000 UTC m=+1092.780529973" Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.450687 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=12.097728915 podStartE2EDuration="1m3.45067007s" podCreationTimestamp="2025-10-01 13:10:21 +0000 UTC" firstStartedPulling="2025-10-01 13:10:32.261173947 +0000 UTC m=+1040.606291433" lastFinishedPulling="2025-10-01 13:11:23.614115102 +0000 UTC m=+1091.959232588" observedRunningTime="2025-10-01 13:11:24.447882921 +0000 UTC m=+1092.793000397" watchObservedRunningTime="2025-10-01 13:11:24.45067007 +0000 UTC m=+1092.795787556" Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.508787 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=56.810287939 podStartE2EDuration="1m10.50876359s" podCreationTimestamp="2025-10-01 13:10:14 +0000 UTC" firstStartedPulling="2025-10-01 13:10:32.110993733 +0000 UTC m=+1040.456111219" lastFinishedPulling="2025-10-01 13:10:45.809469354 +0000 UTC m=+1054.154586870" observedRunningTime="2025-10-01 13:11:24.501948967 +0000 UTC m=+1092.847066493" watchObservedRunningTime="2025-10-01 13:11:24.50876359 +0000 UTC m=+1092.853881086" Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.523172 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=54.673913047 podStartE2EDuration="1m9.523152979s" podCreationTimestamp="2025-10-01 13:10:15 +0000 UTC" firstStartedPulling="2025-10-01 13:10:32.089733009 +0000 UTC m=+1040.434850495" lastFinishedPulling="2025-10-01 13:10:46.938972891 +0000 UTC m=+1055.284090427" observedRunningTime="2025-10-01 13:11:24.521343878 +0000 UTC m=+1092.866461364" watchObservedRunningTime="2025-10-01 13:11:24.523152979 +0000 UTC m=+1092.868270465" Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.778473 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.937038 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b0a8dd5b-b066-4203-b283-3ff979e8da98-etc-swift\") pod \"b0a8dd5b-b066-4203-b283-3ff979e8da98\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.937115 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b0a8dd5b-b066-4203-b283-3ff979e8da98-swiftconf\") pod \"b0a8dd5b-b066-4203-b283-3ff979e8da98\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.937215 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a8dd5b-b066-4203-b283-3ff979e8da98-combined-ca-bundle\") pod \"b0a8dd5b-b066-4203-b283-3ff979e8da98\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.937272 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b0a8dd5b-b066-4203-b283-3ff979e8da98-ring-data-devices\") pod \"b0a8dd5b-b066-4203-b283-3ff979e8da98\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.937297 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmw7x\" (UniqueName: \"kubernetes.io/projected/b0a8dd5b-b066-4203-b283-3ff979e8da98-kube-api-access-dmw7x\") pod \"b0a8dd5b-b066-4203-b283-3ff979e8da98\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.937317 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0a8dd5b-b066-4203-b283-3ff979e8da98-scripts\") pod \"b0a8dd5b-b066-4203-b283-3ff979e8da98\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.937824 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b0a8dd5b-b066-4203-b283-3ff979e8da98-dispersionconf\") pod \"b0a8dd5b-b066-4203-b283-3ff979e8da98\" (UID: \"b0a8dd5b-b066-4203-b283-3ff979e8da98\") " Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.937955 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0a8dd5b-b066-4203-b283-3ff979e8da98-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b0a8dd5b-b066-4203-b283-3ff979e8da98" (UID: "b0a8dd5b-b066-4203-b283-3ff979e8da98"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.938077 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0a8dd5b-b066-4203-b283-3ff979e8da98-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b0a8dd5b-b066-4203-b283-3ff979e8da98" (UID: "b0a8dd5b-b066-4203-b283-3ff979e8da98"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.938445 4851 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b0a8dd5b-b066-4203-b283-3ff979e8da98-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.938490 4851 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b0a8dd5b-b066-4203-b283-3ff979e8da98-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.950645 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0a8dd5b-b066-4203-b283-3ff979e8da98-kube-api-access-dmw7x" (OuterVolumeSpecName: "kube-api-access-dmw7x") pod "b0a8dd5b-b066-4203-b283-3ff979e8da98" (UID: "b0a8dd5b-b066-4203-b283-3ff979e8da98"). InnerVolumeSpecName "kube-api-access-dmw7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.953292 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a8dd5b-b066-4203-b283-3ff979e8da98-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b0a8dd5b-b066-4203-b283-3ff979e8da98" (UID: "b0a8dd5b-b066-4203-b283-3ff979e8da98"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.963914 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a8dd5b-b066-4203-b283-3ff979e8da98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0a8dd5b-b066-4203-b283-3ff979e8da98" (UID: "b0a8dd5b-b066-4203-b283-3ff979e8da98"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.966989 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0a8dd5b-b066-4203-b283-3ff979e8da98-scripts" (OuterVolumeSpecName: "scripts") pod "b0a8dd5b-b066-4203-b283-3ff979e8da98" (UID: "b0a8dd5b-b066-4203-b283-3ff979e8da98"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:11:24 crc kubenswrapper[4851]: I1001 13:11:24.983830 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a8dd5b-b066-4203-b283-3ff979e8da98-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b0a8dd5b-b066-4203-b283-3ff979e8da98" (UID: "b0a8dd5b-b066-4203-b283-3ff979e8da98"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:11:25 crc kubenswrapper[4851]: I1001 13:11:25.039648 4851 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b0a8dd5b-b066-4203-b283-3ff979e8da98-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:25 crc kubenswrapper[4851]: I1001 13:11:25.039689 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a8dd5b-b066-4203-b283-3ff979e8da98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:25 crc kubenswrapper[4851]: I1001 13:11:25.039700 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmw7x\" (UniqueName: \"kubernetes.io/projected/b0a8dd5b-b066-4203-b283-3ff979e8da98-kube-api-access-dmw7x\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:25 crc kubenswrapper[4851]: I1001 13:11:25.039710 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0a8dd5b-b066-4203-b283-3ff979e8da98-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:25 crc kubenswrapper[4851]: I1001 13:11:25.039718 4851 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b0a8dd5b-b066-4203-b283-3ff979e8da98-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:25 crc kubenswrapper[4851]: I1001 13:11:25.160996 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:11:25 crc kubenswrapper[4851]: I1001 13:11:25.166871 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-dmq5k" podUID="7b15e5b7-162e-46ee-a292-bf763704cda6" containerName="ovn-controller" probeResult="failure" output=< Oct 01 13:11:25 crc kubenswrapper[4851]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 01 13:11:25 crc kubenswrapper[4851]: > Oct 01 13:11:25 crc kubenswrapper[4851]: I1001 13:11:25.395595 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s5dg8" event={"ID":"b0a8dd5b-b066-4203-b283-3ff979e8da98","Type":"ContainerDied","Data":"1a7009f5fd7480e79ce85bbf014b1ca650543e8efb16dc51093331793bacd242"} Oct 01 13:11:25 crc kubenswrapper[4851]: I1001 13:11:25.395684 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a7009f5fd7480e79ce85bbf014b1ca650543e8efb16dc51093331793bacd242" Oct 01 13:11:25 crc kubenswrapper[4851]: I1001 13:11:25.395747 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-s5dg8" Oct 01 13:11:25 crc kubenswrapper[4851]: I1001 13:11:25.829506 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-dkxdn" Oct 01 13:11:25 crc kubenswrapper[4851]: I1001 13:11:25.956251 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vzhg\" (UniqueName: \"kubernetes.io/projected/ef457f3c-0b63-4180-a5a7-bbf92e55a195-kube-api-access-5vzhg\") pod \"ef457f3c-0b63-4180-a5a7-bbf92e55a195\" (UID: \"ef457f3c-0b63-4180-a5a7-bbf92e55a195\") " Oct 01 13:11:25 crc kubenswrapper[4851]: I1001 13:11:25.960693 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef457f3c-0b63-4180-a5a7-bbf92e55a195-kube-api-access-5vzhg" (OuterVolumeSpecName: "kube-api-access-5vzhg") pod "ef457f3c-0b63-4180-a5a7-bbf92e55a195" (UID: "ef457f3c-0b63-4180-a5a7-bbf92e55a195"). InnerVolumeSpecName "kube-api-access-5vzhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:11:25 crc kubenswrapper[4851]: I1001 13:11:25.964255 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-d2a4-account-create-gvkcm" Oct 01 13:11:26 crc kubenswrapper[4851]: I1001 13:11:26.010145 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-qtxhk" Oct 01 13:11:26 crc kubenswrapper[4851]: I1001 13:11:26.057493 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tf9r\" (UniqueName: \"kubernetes.io/projected/6b8bb329-e08f-415d-a643-94cffe80471b-kube-api-access-5tf9r\") pod \"6b8bb329-e08f-415d-a643-94cffe80471b\" (UID: \"6b8bb329-e08f-415d-a643-94cffe80471b\") " Oct 01 13:11:26 crc kubenswrapper[4851]: I1001 13:11:26.057910 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vzhg\" (UniqueName: \"kubernetes.io/projected/ef457f3c-0b63-4180-a5a7-bbf92e55a195-kube-api-access-5vzhg\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:26 crc kubenswrapper[4851]: I1001 13:11:26.060848 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b8bb329-e08f-415d-a643-94cffe80471b-kube-api-access-5tf9r" (OuterVolumeSpecName: "kube-api-access-5tf9r") pod "6b8bb329-e08f-415d-a643-94cffe80471b" (UID: "6b8bb329-e08f-415d-a643-94cffe80471b"). InnerVolumeSpecName "kube-api-access-5tf9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:11:26 crc kubenswrapper[4851]: I1001 13:11:26.159161 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzrvz\" (UniqueName: \"kubernetes.io/projected/079f3beb-593f-4866-ae54-8e1fb880c9b1-kube-api-access-bzrvz\") pod \"079f3beb-593f-4866-ae54-8e1fb880c9b1\" (UID: \"079f3beb-593f-4866-ae54-8e1fb880c9b1\") " Oct 01 13:11:26 crc kubenswrapper[4851]: I1001 13:11:26.159726 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tf9r\" (UniqueName: \"kubernetes.io/projected/6b8bb329-e08f-415d-a643-94cffe80471b-kube-api-access-5tf9r\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:26 crc kubenswrapper[4851]: I1001 13:11:26.162375 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/079f3beb-593f-4866-ae54-8e1fb880c9b1-kube-api-access-bzrvz" (OuterVolumeSpecName: "kube-api-access-bzrvz") pod "079f3beb-593f-4866-ae54-8e1fb880c9b1" (UID: "079f3beb-593f-4866-ae54-8e1fb880c9b1"). InnerVolumeSpecName "kube-api-access-bzrvz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:11:26 crc kubenswrapper[4851]: I1001 13:11:26.261899 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzrvz\" (UniqueName: \"kubernetes.io/projected/079f3beb-593f-4866-ae54-8e1fb880c9b1-kube-api-access-bzrvz\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:26 crc kubenswrapper[4851]: I1001 13:11:26.402720 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-d2a4-account-create-gvkcm" event={"ID":"6b8bb329-e08f-415d-a643-94cffe80471b","Type":"ContainerDied","Data":"12bfc14daf768b6707b56a570ee1814824af37572207b39e033a2a848c474948"} Oct 01 13:11:26 crc kubenswrapper[4851]: I1001 13:11:26.402762 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12bfc14daf768b6707b56a570ee1814824af37572207b39e033a2a848c474948" Oct 01 13:11:26 crc kubenswrapper[4851]: I1001 13:11:26.402776 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-d2a4-account-create-gvkcm" Oct 01 13:11:26 crc kubenswrapper[4851]: I1001 13:11:26.404242 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-qtxhk" event={"ID":"079f3beb-593f-4866-ae54-8e1fb880c9b1","Type":"ContainerDied","Data":"fdd62f94b8c12388727bc4aefadfc3a7eab1aaa6b22c346cabae80d2819c3f07"} Oct 01 13:11:26 crc kubenswrapper[4851]: I1001 13:11:26.404264 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdd62f94b8c12388727bc4aefadfc3a7eab1aaa6b22c346cabae80d2819c3f07" Oct 01 13:11:26 crc kubenswrapper[4851]: I1001 13:11:26.404248 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-qtxhk" Oct 01 13:11:26 crc kubenswrapper[4851]: I1001 13:11:26.405657 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dkxdn" event={"ID":"ef457f3c-0b63-4180-a5a7-bbf92e55a195","Type":"ContainerDied","Data":"1c305fde4d9be3c6dc6d5d73ad6c9ed3bb96644df85b4ae92c0519147a4e2167"} Oct 01 13:11:26 crc kubenswrapper[4851]: I1001 13:11:26.405676 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c305fde4d9be3c6dc6d5d73ad6c9ed3bb96644df85b4ae92c0519147a4e2167" Oct 01 13:11:26 crc kubenswrapper[4851]: I1001 13:11:26.405702 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-dkxdn" Oct 01 13:11:28 crc kubenswrapper[4851]: I1001 13:11:28.218819 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.149129 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-dmq5k" podUID="7b15e5b7-162e-46ee-a292-bf763704cda6" containerName="ovn-controller" probeResult="failure" output=< Oct 01 13:11:30 crc kubenswrapper[4851]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 01 13:11:30 crc kubenswrapper[4851]: > Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.157852 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-gt7fw" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.397789 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dmq5k-config-72qtf"] Oct 01 13:11:30 crc kubenswrapper[4851]: E1001 13:11:30.398165 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b8bb329-e08f-415d-a643-94cffe80471b" containerName="mariadb-account-create" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.398185 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8bb329-e08f-415d-a643-94cffe80471b" containerName="mariadb-account-create" Oct 01 13:11:30 crc kubenswrapper[4851]: E1001 13:11:30.398200 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079f3beb-593f-4866-ae54-8e1fb880c9b1" containerName="mariadb-database-create" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.398210 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="079f3beb-593f-4866-ae54-8e1fb880c9b1" containerName="mariadb-database-create" Oct 01 13:11:30 crc kubenswrapper[4851]: E1001 13:11:30.398227 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a8dd5b-b066-4203-b283-3ff979e8da98" containerName="swift-ring-rebalance" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.398235 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a8dd5b-b066-4203-b283-3ff979e8da98" containerName="swift-ring-rebalance" Oct 01 13:11:30 crc kubenswrapper[4851]: E1001 13:11:30.398251 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef457f3c-0b63-4180-a5a7-bbf92e55a195" containerName="mariadb-database-create" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.398259 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef457f3c-0b63-4180-a5a7-bbf92e55a195" containerName="mariadb-database-create" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.398463 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="079f3beb-593f-4866-ae54-8e1fb880c9b1" containerName="mariadb-database-create" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.398479 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b8bb329-e08f-415d-a643-94cffe80471b" containerName="mariadb-account-create" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.398526 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef457f3c-0b63-4180-a5a7-bbf92e55a195" containerName="mariadb-database-create" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.398543 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0a8dd5b-b066-4203-b283-3ff979e8da98" containerName="swift-ring-rebalance" Oct 01 13:11:30 crc kubenswrapper[4851]: 
I1001 13:11:30.399191 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dmq5k-config-72qtf" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.402198 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.413595 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dmq5k-config-72qtf"] Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.536703 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1981a68a-64c9-4961-ae39-18b562e60681-scripts\") pod \"ovn-controller-dmq5k-config-72qtf\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " pod="openstack/ovn-controller-dmq5k-config-72qtf" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.536762 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1981a68a-64c9-4961-ae39-18b562e60681-var-run-ovn\") pod \"ovn-controller-dmq5k-config-72qtf\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " pod="openstack/ovn-controller-dmq5k-config-72qtf" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.536802 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1981a68a-64c9-4961-ae39-18b562e60681-additional-scripts\") pod \"ovn-controller-dmq5k-config-72qtf\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " pod="openstack/ovn-controller-dmq5k-config-72qtf" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.536882 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1981a68a-64c9-4961-ae39-18b562e60681-var-run\") pod \"ovn-controller-dmq5k-config-72qtf\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " pod="openstack/ovn-controller-dmq5k-config-72qtf" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.537164 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pzw2\" (UniqueName: \"kubernetes.io/projected/1981a68a-64c9-4961-ae39-18b562e60681-kube-api-access-9pzw2\") pod \"ovn-controller-dmq5k-config-72qtf\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " pod="openstack/ovn-controller-dmq5k-config-72qtf" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.537217 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1981a68a-64c9-4961-ae39-18b562e60681-var-log-ovn\") pod \"ovn-controller-dmq5k-config-72qtf\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " pod="openstack/ovn-controller-dmq5k-config-72qtf" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.639249 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1981a68a-64c9-4961-ae39-18b562e60681-scripts\") pod \"ovn-controller-dmq5k-config-72qtf\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " pod="openstack/ovn-controller-dmq5k-config-72qtf" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.639308 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/1981a68a-64c9-4961-ae39-18b562e60681-var-run-ovn\") pod \"ovn-controller-dmq5k-config-72qtf\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " pod="openstack/ovn-controller-dmq5k-config-72qtf" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.639344 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1981a68a-64c9-4961-ae39-18b562e60681-additional-scripts\") pod \"ovn-controller-dmq5k-config-72qtf\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " pod="openstack/ovn-controller-dmq5k-config-72qtf" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.639379 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1981a68a-64c9-4961-ae39-18b562e60681-var-run\") pod \"ovn-controller-dmq5k-config-72qtf\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " pod="openstack/ovn-controller-dmq5k-config-72qtf" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.639469 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pzw2\" (UniqueName: \"kubernetes.io/projected/1981a68a-64c9-4961-ae39-18b562e60681-kube-api-access-9pzw2\") pod \"ovn-controller-dmq5k-config-72qtf\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " pod="openstack/ovn-controller-dmq5k-config-72qtf" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.639529 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1981a68a-64c9-4961-ae39-18b562e60681-var-log-ovn\") pod \"ovn-controller-dmq5k-config-72qtf\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " pod="openstack/ovn-controller-dmq5k-config-72qtf" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.639871 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1981a68a-64c9-4961-ae39-18b562e60681-var-log-ovn\") pod \"ovn-controller-dmq5k-config-72qtf\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " pod="openstack/ovn-controller-dmq5k-config-72qtf" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.640882 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1981a68a-64c9-4961-ae39-18b562e60681-var-run\") pod \"ovn-controller-dmq5k-config-72qtf\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " pod="openstack/ovn-controller-dmq5k-config-72qtf" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.640907 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1981a68a-64c9-4961-ae39-18b562e60681-var-run-ovn\") pod \"ovn-controller-dmq5k-config-72qtf\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " pod="openstack/ovn-controller-dmq5k-config-72qtf" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.641266 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1981a68a-64c9-4961-ae39-18b562e60681-additional-scripts\") pod \"ovn-controller-dmq5k-config-72qtf\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " pod="openstack/ovn-controller-dmq5k-config-72qtf" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.643576 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/1981a68a-64c9-4961-ae39-18b562e60681-scripts\") pod \"ovn-controller-dmq5k-config-72qtf\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " pod="openstack/ovn-controller-dmq5k-config-72qtf" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.668725 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pzw2\" (UniqueName: \"kubernetes.io/projected/1981a68a-64c9-4961-ae39-18b562e60681-kube-api-access-9pzw2\") pod \"ovn-controller-dmq5k-config-72qtf\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " pod="openstack/ovn-controller-dmq5k-config-72qtf" Oct 01 13:11:30 crc kubenswrapper[4851]: I1001 13:11:30.723425 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dmq5k-config-72qtf" Oct 01 13:11:31 crc kubenswrapper[4851]: I1001 13:11:31.212228 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dmq5k-config-72qtf"] Oct 01 13:11:31 crc kubenswrapper[4851]: W1001 13:11:31.214366 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1981a68a_64c9_4961_ae39_18b562e60681.slice/crio-705ee754fa8830cf7080d31dbfb6d9b5994fe2cbefbcf86ea0c848b099755a5e WatchSource:0}: Error finding container 705ee754fa8830cf7080d31dbfb6d9b5994fe2cbefbcf86ea0c848b099755a5e: Status 404 returned error can't find the container with id 705ee754fa8830cf7080d31dbfb6d9b5994fe2cbefbcf86ea0c848b099755a5e Oct 01 13:11:31 crc kubenswrapper[4851]: I1001 13:11:31.455657 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dmq5k-config-72qtf" event={"ID":"1981a68a-64c9-4961-ae39-18b562e60681","Type":"ContainerStarted","Data":"705ee754fa8830cf7080d31dbfb6d9b5994fe2cbefbcf86ea0c848b099755a5e"} Oct 01 13:11:32 crc kubenswrapper[4851]: I1001 13:11:32.487088 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dmq5k-config-72qtf" event={"ID":"1981a68a-64c9-4961-ae39-18b562e60681","Type":"ContainerDied","Data":"a732157d5877bbcdd56e964be87aa9441d7057316378fd23aa232f972176764b"} Oct 01 13:11:32 crc kubenswrapper[4851]: I1001 13:11:32.487575 4851 generic.go:334] "Generic (PLEG): container finished" podID="1981a68a-64c9-4961-ae39-18b562e60681" containerID="a732157d5877bbcdd56e964be87aa9441d7057316378fd23aa232f972176764b" exitCode=0 Oct 01 13:11:33 crc kubenswrapper[4851]: I1001 13:11:33.922019 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dmq5k-config-72qtf" Oct 01 13:11:34 crc kubenswrapper[4851]: I1001 13:11:34.009933 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1981a68a-64c9-4961-ae39-18b562e60681-var-log-ovn\") pod \"1981a68a-64c9-4961-ae39-18b562e60681\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " Oct 01 13:11:34 crc kubenswrapper[4851]: I1001 13:11:34.010083 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1981a68a-64c9-4961-ae39-18b562e60681-additional-scripts\") pod \"1981a68a-64c9-4961-ae39-18b562e60681\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " Oct 01 13:11:34 crc kubenswrapper[4851]: I1001 13:11:34.010109 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pzw2\" (UniqueName: \"kubernetes.io/projected/1981a68a-64c9-4961-ae39-18b562e60681-kube-api-access-9pzw2\") pod \"1981a68a-64c9-4961-ae39-18b562e60681\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " Oct 01 13:11:34 crc kubenswrapper[4851]: I1001 13:11:34.010181 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1981a68a-64c9-4961-ae39-18b562e60681-var-run-ovn\") pod \"1981a68a-64c9-4961-ae39-18b562e60681\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " Oct 01 13:11:34 crc kubenswrapper[4851]: I1001 13:11:34.010170 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1981a68a-64c9-4961-ae39-18b562e60681-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1981a68a-64c9-4961-ae39-18b562e60681" (UID: "1981a68a-64c9-4961-ae39-18b562e60681"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:11:34 crc kubenswrapper[4851]: I1001 13:11:34.010199 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1981a68a-64c9-4961-ae39-18b562e60681-scripts\") pod \"1981a68a-64c9-4961-ae39-18b562e60681\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " Oct 01 13:11:34 crc kubenswrapper[4851]: I1001 13:11:34.010347 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1981a68a-64c9-4961-ae39-18b562e60681-var-run\") pod \"1981a68a-64c9-4961-ae39-18b562e60681\" (UID: \"1981a68a-64c9-4961-ae39-18b562e60681\") " Oct 01 13:11:34 crc kubenswrapper[4851]: I1001 13:11:34.010835 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1981a68a-64c9-4961-ae39-18b562e60681-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1981a68a-64c9-4961-ae39-18b562e60681" (UID: "1981a68a-64c9-4961-ae39-18b562e60681"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:11:34 crc kubenswrapper[4851]: I1001 13:11:34.010901 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1981a68a-64c9-4961-ae39-18b562e60681-var-run" (OuterVolumeSpecName: "var-run") pod "1981a68a-64c9-4961-ae39-18b562e60681" (UID: "1981a68a-64c9-4961-ae39-18b562e60681"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:11:34 crc kubenswrapper[4851]: I1001 13:11:34.011264 4851 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1981a68a-64c9-4961-ae39-18b562e60681-var-run\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:34 crc kubenswrapper[4851]: I1001 13:11:34.011297 4851 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1981a68a-64c9-4961-ae39-18b562e60681-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:34 crc kubenswrapper[4851]: I1001 13:11:34.011314 4851 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1981a68a-64c9-4961-ae39-18b562e60681-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:34 crc kubenswrapper[4851]: I1001 13:11:34.011877 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1981a68a-64c9-4961-ae39-18b562e60681-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "1981a68a-64c9-4961-ae39-18b562e60681" (UID: "1981a68a-64c9-4961-ae39-18b562e60681"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:11:34 crc kubenswrapper[4851]: I1001 13:11:34.012052 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1981a68a-64c9-4961-ae39-18b562e60681-scripts" (OuterVolumeSpecName: "scripts") pod "1981a68a-64c9-4961-ae39-18b562e60681" (UID: "1981a68a-64c9-4961-ae39-18b562e60681"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:11:34 crc kubenswrapper[4851]: I1001 13:11:34.018391 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1981a68a-64c9-4961-ae39-18b562e60681-kube-api-access-9pzw2" (OuterVolumeSpecName: "kube-api-access-9pzw2") pod "1981a68a-64c9-4961-ae39-18b562e60681" (UID: "1981a68a-64c9-4961-ae39-18b562e60681"). InnerVolumeSpecName "kube-api-access-9pzw2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:11:34 crc kubenswrapper[4851]: I1001 13:11:34.112948 4851 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1981a68a-64c9-4961-ae39-18b562e60681-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:34 crc kubenswrapper[4851]: I1001 13:11:34.113010 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pzw2\" (UniqueName: \"kubernetes.io/projected/1981a68a-64c9-4961-ae39-18b562e60681-kube-api-access-9pzw2\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:34 crc kubenswrapper[4851]: I1001 13:11:34.113025 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1981a68a-64c9-4961-ae39-18b562e60681-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:34 crc kubenswrapper[4851]: I1001 13:11:34.503629 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dmq5k-config-72qtf" event={"ID":"1981a68a-64c9-4961-ae39-18b562e60681","Type":"ContainerDied","Data":"705ee754fa8830cf7080d31dbfb6d9b5994fe2cbefbcf86ea0c848b099755a5e"} Oct 01 13:11:34 crc kubenswrapper[4851]: I1001 13:11:34.503664 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="705ee754fa8830cf7080d31dbfb6d9b5994fe2cbefbcf86ea0c848b099755a5e" Oct 01 13:11:34 crc kubenswrapper[4851]: I1001 13:11:34.503695 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dmq5k-config-72qtf" Oct 01 13:11:35 crc kubenswrapper[4851]: I1001 13:11:35.070130 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-dmq5k-config-72qtf"] Oct 01 13:11:35 crc kubenswrapper[4851]: I1001 13:11:35.082061 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-dmq5k-config-72qtf"] Oct 01 13:11:35 crc kubenswrapper[4851]: I1001 13:11:35.129208 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-etc-swift\") pod \"swift-storage-0\" (UID: \"6a4d7762-af01-4c7e-9641-4b2054a8885d\") " pod="openstack/swift-storage-0" Oct 01 13:11:35 crc kubenswrapper[4851]: I1001 13:11:35.139087 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6a4d7762-af01-4c7e-9641-4b2054a8885d-etc-swift\") pod \"swift-storage-0\" (UID: \"6a4d7762-af01-4c7e-9641-4b2054a8885d\") " pod="openstack/swift-storage-0" Oct 01 13:11:35 crc kubenswrapper[4851]: I1001 13:11:35.150160 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-dmq5k" Oct 01 13:11:35 crc kubenswrapper[4851]: I1001 13:11:35.275248 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 01 13:11:35 crc kubenswrapper[4851]: I1001 13:11:35.387392 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-32f8-account-create-7mdj5"] Oct 01 13:11:35 crc kubenswrapper[4851]: E1001 13:11:35.388011 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1981a68a-64c9-4961-ae39-18b562e60681" containerName="ovn-config" Oct 01 13:11:35 crc kubenswrapper[4851]: I1001 13:11:35.388023 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="1981a68a-64c9-4961-ae39-18b562e60681" containerName="ovn-config" Oct 01 13:11:35 crc kubenswrapper[4851]: I1001 13:11:35.388182 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="1981a68a-64c9-4961-ae39-18b562e60681" containerName="ovn-config" Oct 01 13:11:35 crc kubenswrapper[4851]: I1001 13:11:35.388700 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-32f8-account-create-7mdj5" Oct 01 13:11:35 crc kubenswrapper[4851]: I1001 13:11:35.391176 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 01 13:11:35 crc kubenswrapper[4851]: I1001 13:11:35.458241 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-32f8-account-create-7mdj5"] Oct 01 13:11:35 crc kubenswrapper[4851]: I1001 13:11:35.535615 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chk77\" (UniqueName: \"kubernetes.io/projected/45d8621f-665c-42d4-81bd-43afeadc7b7d-kube-api-access-chk77\") pod \"glance-32f8-account-create-7mdj5\" (UID: \"45d8621f-665c-42d4-81bd-43afeadc7b7d\") " pod="openstack/glance-32f8-account-create-7mdj5" Oct 01 13:11:35 crc kubenswrapper[4851]: I1001 13:11:35.636891 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chk77\" (UniqueName: \"kubernetes.io/projected/45d8621f-665c-42d4-81bd-43afeadc7b7d-kube-api-access-chk77\") pod \"glance-32f8-account-create-7mdj5\" (UID: \"45d8621f-665c-42d4-81bd-43afeadc7b7d\") " pod="openstack/glance-32f8-account-create-7mdj5" Oct 01 13:11:35 crc kubenswrapper[4851]: I1001 13:11:35.655550 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chk77\" (UniqueName: \"kubernetes.io/projected/45d8621f-665c-42d4-81bd-43afeadc7b7d-kube-api-access-chk77\") pod \"glance-32f8-account-create-7mdj5\" (UID: \"45d8621f-665c-42d4-81bd-43afeadc7b7d\") " pod="openstack/glance-32f8-account-create-7mdj5" Oct 01 13:11:35 crc kubenswrapper[4851]: I1001 13:11:35.705876 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-32f8-account-create-7mdj5" Oct 01 13:11:35 crc kubenswrapper[4851]: I1001 13:11:35.848629 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="95c53639-696f-4d10-a297-7173dd3b394f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Oct 01 13:11:35 crc kubenswrapper[4851]: I1001 13:11:35.954758 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 01 13:11:35 crc kubenswrapper[4851]: W1001 13:11:35.966374 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a4d7762_af01_4c7e_9641_4b2054a8885d.slice/crio-01cc481130a509d1291c3b473a08156446f7a2de5aab6b3bd93307c2f5f8e101 WatchSource:0}: Error finding container 01cc481130a509d1291c3b473a08156446f7a2de5aab6b3bd93307c2f5f8e101: Status 404 returned error can't find the container with id 01cc481130a509d1291c3b473a08156446f7a2de5aab6b3bd93307c2f5f8e101 Oct 01 13:11:36 crc kubenswrapper[4851]: I1001 13:11:36.149048 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-32f8-account-create-7mdj5"] Oct 01 13:11:36 crc kubenswrapper[4851]: W1001 13:11:36.159032 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45d8621f_665c_42d4_81bd_43afeadc7b7d.slice/crio-70a7eee8ed4717af6abc33793065070683023d9ee7e4069da08dcf843f82610f WatchSource:0}: Error finding container 70a7eee8ed4717af6abc33793065070683023d9ee7e4069da08dcf843f82610f: Status 404 returned error can't find the container with id 70a7eee8ed4717af6abc33793065070683023d9ee7e4069da08dcf843f82610f Oct 01 13:11:36 crc kubenswrapper[4851]: I1001 13:11:36.189160 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-notifications-server-0" podUID="8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Oct 01 13:11:36 crc kubenswrapper[4851]: I1001 13:11:36.344993 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1981a68a-64c9-4961-ae39-18b562e60681" path="/var/lib/kubelet/pods/1981a68a-64c9-4961-ae39-18b562e60681/volumes" Oct 01 13:11:36 crc kubenswrapper[4851]: I1001 13:11:36.406868 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="712c3704-a775-4fac-81d6-9aa9cfdc48ef" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Oct 01 13:11:36 crc kubenswrapper[4851]: I1001 13:11:36.520784 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a4d7762-af01-4c7e-9641-4b2054a8885d","Type":"ContainerStarted","Data":"01cc481130a509d1291c3b473a08156446f7a2de5aab6b3bd93307c2f5f8e101"} Oct 01 13:11:36 crc kubenswrapper[4851]: I1001 13:11:36.522263 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-32f8-account-create-7mdj5" event={"ID":"45d8621f-665c-42d4-81bd-43afeadc7b7d","Type":"ContainerStarted","Data":"70a7eee8ed4717af6abc33793065070683023d9ee7e4069da08dcf843f82610f"} Oct 01 13:11:37 crc kubenswrapper[4851]: I1001 13:11:37.534368 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6a4d7762-af01-4c7e-9641-4b2054a8885d","Type":"ContainerStarted","Data":"6bcc1af6aae90bebd5b5576e5e882767879be694574c8e764c61abaaf5968a9b"} Oct 01 13:11:37 crc kubenswrapper[4851]: I1001 13:11:37.534790 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a4d7762-af01-4c7e-9641-4b2054a8885d","Type":"ContainerStarted","Data":"a0848767300a71b1cd72e5d9b0eb1657ac4c23b7d7c6e9a8625f361913edd0ed"} Oct 01 13:11:37 crc kubenswrapper[4851]: I1001 13:11:37.536278 4851 generic.go:334] "Generic (PLEG): container finished" podID="45d8621f-665c-42d4-81bd-43afeadc7b7d" containerID="806a9baf4982bba0142b95d4e13a60fadb66cba456b3ea3252115f3cc6877ede" exitCode=0 Oct 01 13:11:37 crc kubenswrapper[4851]: I1001 13:11:37.536307 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-32f8-account-create-7mdj5" event={"ID":"45d8621f-665c-42d4-81bd-43afeadc7b7d","Type":"ContainerDied","Data":"806a9baf4982bba0142b95d4e13a60fadb66cba456b3ea3252115f3cc6877ede"} Oct 01 13:11:38 crc kubenswrapper[4851]: I1001 13:11:38.218307 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 01 13:11:38 crc kubenswrapper[4851]: I1001 13:11:38.222875 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 01 13:11:38 crc kubenswrapper[4851]: I1001 13:11:38.554216 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a4d7762-af01-4c7e-9641-4b2054a8885d","Type":"ContainerStarted","Data":"8d648e2ba225131c92a862d2dc4537ee0c7a114eb98a1ab4a20aba6d95ad2ffa"} Oct 01 13:11:38 crc kubenswrapper[4851]: I1001 13:11:38.554818 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a4d7762-af01-4c7e-9641-4b2054a8885d","Type":"ContainerStarted","Data":"34079b91d45085895ac6c4ba1fb95e3421d96008b841de585bdff612dd2fb197"} Oct 01 13:11:38 crc kubenswrapper[4851]: I1001 13:11:38.554838 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a4d7762-af01-4c7e-9641-4b2054a8885d","Type":"ContainerStarted","Data":"0fcdf375af267086aa2ed9bc88b479250902e079c841312ccff13ecaea438c78"} Oct 01 13:11:38 crc kubenswrapper[4851]: I1001 13:11:38.556378 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 01 13:11:38 crc kubenswrapper[4851]: I1001 13:11:38.915757 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-32f8-account-create-7mdj5" Oct 01 13:11:38 crc kubenswrapper[4851]: I1001 13:11:38.996215 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chk77\" (UniqueName: \"kubernetes.io/projected/45d8621f-665c-42d4-81bd-43afeadc7b7d-kube-api-access-chk77\") pod \"45d8621f-665c-42d4-81bd-43afeadc7b7d\" (UID: \"45d8621f-665c-42d4-81bd-43afeadc7b7d\") " Oct 01 13:11:39 crc kubenswrapper[4851]: I1001 13:11:39.001963 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45d8621f-665c-42d4-81bd-43afeadc7b7d-kube-api-access-chk77" (OuterVolumeSpecName: "kube-api-access-chk77") pod "45d8621f-665c-42d4-81bd-43afeadc7b7d" (UID: "45d8621f-665c-42d4-81bd-43afeadc7b7d"). InnerVolumeSpecName "kube-api-access-chk77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:11:39 crc kubenswrapper[4851]: I1001 13:11:39.098279 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chk77\" (UniqueName: \"kubernetes.io/projected/45d8621f-665c-42d4-81bd-43afeadc7b7d-kube-api-access-chk77\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:39 crc kubenswrapper[4851]: I1001 13:11:39.600864 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a4d7762-af01-4c7e-9641-4b2054a8885d","Type":"ContainerStarted","Data":"069b52cc6b920f20f8064c6b6aad56a031863386200019deef920be339f4d996"} Oct 01 13:11:39 crc kubenswrapper[4851]: I1001 13:11:39.601068 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a4d7762-af01-4c7e-9641-4b2054a8885d","Type":"ContainerStarted","Data":"a42cdd88605f155224b96d3094e0b8b74880cb7d76c28d8f100b6392cda3490c"} Oct 01 13:11:39 crc kubenswrapper[4851]: I1001 13:11:39.601081 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a4d7762-af01-4c7e-9641-4b2054a8885d","Type":"ContainerStarted","Data":"b845a20f076a0346ecad2adbfdf43618b6b1aacfe4cf6f0ef50c453d666881de"} Oct 01 13:11:39 crc kubenswrapper[4851]: I1001 13:11:39.607190 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-32f8-account-create-7mdj5" Oct 01 13:11:39 crc kubenswrapper[4851]: I1001 13:11:39.607360 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-32f8-account-create-7mdj5" event={"ID":"45d8621f-665c-42d4-81bd-43afeadc7b7d","Type":"ContainerDied","Data":"70a7eee8ed4717af6abc33793065070683023d9ee7e4069da08dcf843f82610f"} Oct 01 13:11:39 crc kubenswrapper[4851]: I1001 13:11:39.607458 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70a7eee8ed4717af6abc33793065070683023d9ee7e4069da08dcf843f82610f" Oct 01 13:11:39 crc kubenswrapper[4851]: I1001 13:11:39.769057 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8535-account-create-j68t5"] Oct 01 13:11:39 crc kubenswrapper[4851]: E1001 13:11:39.769638 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d8621f-665c-42d4-81bd-43afeadc7b7d" containerName="mariadb-account-create" Oct 01 13:11:39 crc kubenswrapper[4851]: I1001 13:11:39.769716 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d8621f-665c-42d4-81bd-43afeadc7b7d" containerName="mariadb-account-create" Oct 01 13:11:39 crc kubenswrapper[4851]: I1001 13:11:39.769948 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="45d8621f-665c-42d4-81bd-43afeadc7b7d" containerName="mariadb-account-create" Oct 01 13:11:39 crc kubenswrapper[4851]: I1001 13:11:39.770579 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8535-account-create-j68t5" Oct 01 13:11:39 crc kubenswrapper[4851]: I1001 13:11:39.772960 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 01 13:11:39 crc kubenswrapper[4851]: I1001 13:11:39.777858 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8535-account-create-j68t5"] Oct 01 13:11:39 crc kubenswrapper[4851]: I1001 13:11:39.910746 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4kws\" (UniqueName: \"kubernetes.io/projected/700c50aa-a63d-44ea-8c51-4b76771caf18-kube-api-access-c4kws\") pod \"keystone-8535-account-create-j68t5\" (UID: \"700c50aa-a63d-44ea-8c51-4b76771caf18\") " pod="openstack/keystone-8535-account-create-j68t5" Oct 01 13:11:39 crc kubenswrapper[4851]: I1001 13:11:39.998398 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-96c5-account-create-rqjrx"] Oct 01 13:11:39 crc kubenswrapper[4851]: I1001 13:11:39.999471 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-96c5-account-create-rqjrx" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.005798 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.006716 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-96c5-account-create-rqjrx"] Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.013073 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4kws\" (UniqueName: \"kubernetes.io/projected/700c50aa-a63d-44ea-8c51-4b76771caf18-kube-api-access-c4kws\") pod \"keystone-8535-account-create-j68t5\" (UID: \"700c50aa-a63d-44ea-8c51-4b76771caf18\") " pod="openstack/keystone-8535-account-create-j68t5" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.072130 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4kws\" (UniqueName: \"kubernetes.io/projected/700c50aa-a63d-44ea-8c51-4b76771caf18-kube-api-access-c4kws\") pod \"keystone-8535-account-create-j68t5\" (UID: \"700c50aa-a63d-44ea-8c51-4b76771caf18\") " pod="openstack/keystone-8535-account-create-j68t5" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.086566 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8535-account-create-j68t5" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.114046 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5wxr\" (UniqueName: \"kubernetes.io/projected/a661d676-d45b-4cf6-8b29-31c28fb2c46b-kube-api-access-q5wxr\") pod \"placement-96c5-account-create-rqjrx\" (UID: \"a661d676-d45b-4cf6-8b29-31c28fb2c46b\") " pod="openstack/placement-96c5-account-create-rqjrx" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.215755 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5wxr\" (UniqueName: \"kubernetes.io/projected/a661d676-d45b-4cf6-8b29-31c28fb2c46b-kube-api-access-q5wxr\") pod \"placement-96c5-account-create-rqjrx\" (UID: \"a661d676-d45b-4cf6-8b29-31c28fb2c46b\") " pod="openstack/placement-96c5-account-create-rqjrx" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.233125 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5wxr\" (UniqueName: \"kubernetes.io/projected/a661d676-d45b-4cf6-8b29-31c28fb2c46b-kube-api-access-q5wxr\") pod \"placement-96c5-account-create-rqjrx\" (UID: \"a661d676-d45b-4cf6-8b29-31c28fb2c46b\") " pod="openstack/placement-96c5-account-create-rqjrx" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.320186 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-96c5-account-create-rqjrx" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.594634 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-hgsvg"] Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.596411 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hgsvg" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.598626 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.600411 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-q4nbb" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.605721 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hgsvg"] Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.682275 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.682686 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="486b3f7c-3593-4950-b470-2d0a2f037e2f" containerName="prometheus" containerID="cri-o://5d3a504e2fc801c80023dc63dd412e85d75002d50155ac11fc8476348268197a" gracePeriod=600 Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.682905 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="486b3f7c-3593-4950-b470-2d0a2f037e2f" containerName="config-reloader" containerID="cri-o://f05fb51d2a98fdac4d8fd593f16ceb782f33265a66248db8be1c3c06402958b1" gracePeriod=600 Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.682899 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="486b3f7c-3593-4950-b470-2d0a2f037e2f" containerName="thanos-sidecar" containerID="cri-o://46100b329bd535edf0d5df7c0f0343220da44cc5bdb743cf098c51aa229456e0" gracePeriod=600 Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.700773 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-96c5-account-create-rqjrx"] Oct 01 13:11:40 crc kubenswrapper[4851]: W1001 13:11:40.704355 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda661d676_d45b_4cf6_8b29_31c28fb2c46b.slice/crio-52ba9ee03f5240d43a100b796cb1ec5fd6305df0fb783b0c27b18d4be65a9ad4 WatchSource:0}: Error finding container 52ba9ee03f5240d43a100b796cb1ec5fd6305df0fb783b0c27b18d4be65a9ad4: Status 404 returned error can't find the container with id 52ba9ee03f5240d43a100b796cb1ec5fd6305df0fb783b0c27b18d4be65a9ad4 Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.723875 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794e875d-440e-4a04-acf1-61b1e63c57d7-combined-ca-bundle\") pod \"glance-db-sync-hgsvg\" (UID: \"794e875d-440e-4a04-acf1-61b1e63c57d7\") " pod="openstack/glance-db-sync-hgsvg" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.724209 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95cl5\" (UniqueName: \"kubernetes.io/projected/794e875d-440e-4a04-acf1-61b1e63c57d7-kube-api-access-95cl5\") pod \"glance-db-sync-hgsvg\" (UID: \"794e875d-440e-4a04-acf1-61b1e63c57d7\") " pod="openstack/glance-db-sync-hgsvg" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.724307 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/794e875d-440e-4a04-acf1-61b1e63c57d7-config-data\") pod \"glance-db-sync-hgsvg\" (UID: \"794e875d-440e-4a04-acf1-61b1e63c57d7\") " pod="openstack/glance-db-sync-hgsvg" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.724385 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/794e875d-440e-4a04-acf1-61b1e63c57d7-db-sync-config-data\") pod \"glance-db-sync-hgsvg\" (UID: \"794e875d-440e-4a04-acf1-61b1e63c57d7\") " pod="openstack/glance-db-sync-hgsvg" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.826056 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95cl5\" (UniqueName: \"kubernetes.io/projected/794e875d-440e-4a04-acf1-61b1e63c57d7-kube-api-access-95cl5\") pod \"glance-db-sync-hgsvg\" (UID: \"794e875d-440e-4a04-acf1-61b1e63c57d7\") " pod="openstack/glance-db-sync-hgsvg" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.826156 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794e875d-440e-4a04-acf1-61b1e63c57d7-config-data\") pod \"glance-db-sync-hgsvg\" (UID: \"794e875d-440e-4a04-acf1-61b1e63c57d7\") " pod="openstack/glance-db-sync-hgsvg" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.826197 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/794e875d-440e-4a04-acf1-61b1e63c57d7-db-sync-config-data\") pod \"glance-db-sync-hgsvg\" (UID: \"794e875d-440e-4a04-acf1-61b1e63c57d7\") " pod="openstack/glance-db-sync-hgsvg" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.826290 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794e875d-440e-4a04-acf1-61b1e63c57d7-combined-ca-bundle\") pod \"glance-db-sync-hgsvg\" (UID: \"794e875d-440e-4a04-acf1-61b1e63c57d7\") " pod="openstack/glance-db-sync-hgsvg" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.831735 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794e875d-440e-4a04-acf1-61b1e63c57d7-combined-ca-bundle\") pod \"glance-db-sync-hgsvg\" (UID: \"794e875d-440e-4a04-acf1-61b1e63c57d7\") " pod="openstack/glance-db-sync-hgsvg" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.834700 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794e875d-440e-4a04-acf1-61b1e63c57d7-config-data\") pod \"glance-db-sync-hgsvg\" (UID: \"794e875d-440e-4a04-acf1-61b1e63c57d7\") " pod="openstack/glance-db-sync-hgsvg" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.836633 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/794e875d-440e-4a04-acf1-61b1e63c57d7-db-sync-config-data\") pod \"glance-db-sync-hgsvg\" (UID: \"794e875d-440e-4a04-acf1-61b1e63c57d7\") " pod="openstack/glance-db-sync-hgsvg" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.843740 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95cl5\" (UniqueName: \"kubernetes.io/projected/794e875d-440e-4a04-acf1-61b1e63c57d7-kube-api-access-95cl5\") pod \"glance-db-sync-hgsvg\" (UID: \"794e875d-440e-4a04-acf1-61b1e63c57d7\") " 
pod="openstack/glance-db-sync-hgsvg" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.933655 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hgsvg" Oct 01 13:11:40 crc kubenswrapper[4851]: I1001 13:11:40.980070 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8535-account-create-j68t5"] Oct 01 13:11:40 crc kubenswrapper[4851]: W1001 13:11:40.991530 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod700c50aa_a63d_44ea_8c51_4b76771caf18.slice/crio-9cd378bbe4b2fb3154c98c4468f37f7f0d54ca7213177d5bc5971bca973613d9 WatchSource:0}: Error finding container 9cd378bbe4b2fb3154c98c4468f37f7f0d54ca7213177d5bc5971bca973613d9: Status 404 returned error can't find the container with id 9cd378bbe4b2fb3154c98c4468f37f7f0d54ca7213177d5bc5971bca973613d9 Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.487549 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hgsvg"] Oct 01 13:11:41 crc kubenswrapper[4851]: W1001 13:11:41.505391 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod794e875d_440e_4a04_acf1_61b1e63c57d7.slice/crio-790fedc8504529909c8fb9bd9937c173c0024321eb94789835a0ad3ff8bb6f04 WatchSource:0}: Error finding container 790fedc8504529909c8fb9bd9937c173c0024321eb94789835a0ad3ff8bb6f04: Status 404 returned error can't find the container with id 790fedc8504529909c8fb9bd9937c173c0024321eb94789835a0ad3ff8bb6f04 Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.644686 4851 generic.go:334] "Generic (PLEG): container finished" podID="700c50aa-a63d-44ea-8c51-4b76771caf18" containerID="bb6012633397d49cb42e2d31280d52b95ec2197e87f5deba1901222f8609e405" exitCode=0 Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.644790 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8535-account-create-j68t5" event={"ID":"700c50aa-a63d-44ea-8c51-4b76771caf18","Type":"ContainerDied","Data":"bb6012633397d49cb42e2d31280d52b95ec2197e87f5deba1901222f8609e405"} Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.644842 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8535-account-create-j68t5" event={"ID":"700c50aa-a63d-44ea-8c51-4b76771caf18","Type":"ContainerStarted","Data":"9cd378bbe4b2fb3154c98c4468f37f7f0d54ca7213177d5bc5971bca973613d9"} Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.656709 4851 generic.go:334] "Generic (PLEG): container finished" podID="486b3f7c-3593-4950-b470-2d0a2f037e2f" containerID="46100b329bd535edf0d5df7c0f0343220da44cc5bdb743cf098c51aa229456e0" exitCode=0 Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.657012 4851 generic.go:334] "Generic (PLEG): container finished" podID="486b3f7c-3593-4950-b470-2d0a2f037e2f" containerID="f05fb51d2a98fdac4d8fd593f16ceb782f33265a66248db8be1c3c06402958b1" exitCode=0 Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.657028 4851 generic.go:334] "Generic (PLEG): container finished" podID="486b3f7c-3593-4950-b470-2d0a2f037e2f" containerID="5d3a504e2fc801c80023dc63dd412e85d75002d50155ac11fc8476348268197a" exitCode=0 Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.657520 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"486b3f7c-3593-4950-b470-2d0a2f037e2f","Type":"ContainerDied","Data":"46100b329bd535edf0d5df7c0f0343220da44cc5bdb743cf098c51aa229456e0"} Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.657924 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"486b3f7c-3593-4950-b470-2d0a2f037e2f","Type":"ContainerDied","Data":"f05fb51d2a98fdac4d8fd593f16ceb782f33265a66248db8be1c3c06402958b1"} Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.658082 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"486b3f7c-3593-4950-b470-2d0a2f037e2f","Type":"ContainerDied","Data":"5d3a504e2fc801c80023dc63dd412e85d75002d50155ac11fc8476348268197a"} Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.671137 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a4d7762-af01-4c7e-9641-4b2054a8885d","Type":"ContainerStarted","Data":"df1f55b12f1f7f89d06ddadf04c8cd0ed10f681cae700e00d9ad2eb990d24a5d"} Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.671170 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a4d7762-af01-4c7e-9641-4b2054a8885d","Type":"ContainerStarted","Data":"3b0a51375a02aaeafb8973b97eefb26884296efbcc838194536aea423f711c37"} Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.671180 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a4d7762-af01-4c7e-9641-4b2054a8885d","Type":"ContainerStarted","Data":"53d017df787a3c8941f1f88d41a69602ac5a2b6139453e5897df72aa8dfe2dce"} Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.671190 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a4d7762-af01-4c7e-9641-4b2054a8885d","Type":"ContainerStarted","Data":"b9a5a79036cf99bd6ec17f20bca72df2c02ff8efc3829488b7c7e47b7f305c4e"} Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.675913 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hgsvg" event={"ID":"794e875d-440e-4a04-acf1-61b1e63c57d7","Type":"ContainerStarted","Data":"790fedc8504529909c8fb9bd9937c173c0024321eb94789835a0ad3ff8bb6f04"} Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.684348 4851 generic.go:334] "Generic (PLEG): container finished" podID="a661d676-d45b-4cf6-8b29-31c28fb2c46b" containerID="8b223680f1ef1c92742587cc02f36c5e1b66ce8fa77114d8927959d98d39143a" exitCode=0 Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.684400 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-96c5-account-create-rqjrx" event={"ID":"a661d676-d45b-4cf6-8b29-31c28fb2c46b","Type":"ContainerDied","Data":"8b223680f1ef1c92742587cc02f36c5e1b66ce8fa77114d8927959d98d39143a"} Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.684423 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-96c5-account-create-rqjrx" event={"ID":"a661d676-d45b-4cf6-8b29-31c28fb2c46b","Type":"ContainerStarted","Data":"52ba9ee03f5240d43a100b796cb1ec5fd6305df0fb783b0c27b18d4be65a9ad4"} Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.819387 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.950058 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h97sc\" (UniqueName: \"kubernetes.io/projected/486b3f7c-3593-4950-b470-2d0a2f037e2f-kube-api-access-h97sc\") pod \"486b3f7c-3593-4950-b470-2d0a2f037e2f\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.950559 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/486b3f7c-3593-4950-b470-2d0a2f037e2f-config\") pod \"486b3f7c-3593-4950-b470-2d0a2f037e2f\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.950729 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\") pod \"486b3f7c-3593-4950-b470-2d0a2f037e2f\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.950784 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/486b3f7c-3593-4950-b470-2d0a2f037e2f-thanos-prometheus-http-client-file\") pod \"486b3f7c-3593-4950-b470-2d0a2f037e2f\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.950832 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/486b3f7c-3593-4950-b470-2d0a2f037e2f-tls-assets\") pod \"486b3f7c-3593-4950-b470-2d0a2f037e2f\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.950854 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/486b3f7c-3593-4950-b470-2d0a2f037e2f-prometheus-metric-storage-rulefiles-0\") pod \"486b3f7c-3593-4950-b470-2d0a2f037e2f\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.950915 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/486b3f7c-3593-4950-b470-2d0a2f037e2f-web-config\") pod \"486b3f7c-3593-4950-b470-2d0a2f037e2f\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.950979 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/486b3f7c-3593-4950-b470-2d0a2f037e2f-config-out\") pod \"486b3f7c-3593-4950-b470-2d0a2f037e2f\" (UID: \"486b3f7c-3593-4950-b470-2d0a2f037e2f\") " Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.954195 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/486b3f7c-3593-4950-b470-2d0a2f037e2f-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "486b3f7c-3593-4950-b470-2d0a2f037e2f" (UID: "486b3f7c-3593-4950-b470-2d0a2f037e2f"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.957270 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/486b3f7c-3593-4950-b470-2d0a2f037e2f-config-out" (OuterVolumeSpecName: "config-out") pod "486b3f7c-3593-4950-b470-2d0a2f037e2f" (UID: "486b3f7c-3593-4950-b470-2d0a2f037e2f"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.962179 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/486b3f7c-3593-4950-b470-2d0a2f037e2f-kube-api-access-h97sc" (OuterVolumeSpecName: "kube-api-access-h97sc") pod "486b3f7c-3593-4950-b470-2d0a2f037e2f" (UID: "486b3f7c-3593-4950-b470-2d0a2f037e2f"). InnerVolumeSpecName "kube-api-access-h97sc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.963184 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/486b3f7c-3593-4950-b470-2d0a2f037e2f-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "486b3f7c-3593-4950-b470-2d0a2f037e2f" (UID: "486b3f7c-3593-4950-b470-2d0a2f037e2f"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.964617 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486b3f7c-3593-4950-b470-2d0a2f037e2f-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "486b3f7c-3593-4950-b470-2d0a2f037e2f" (UID: "486b3f7c-3593-4950-b470-2d0a2f037e2f"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.985466 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "486b3f7c-3593-4950-b470-2d0a2f037e2f" (UID: "486b3f7c-3593-4950-b470-2d0a2f037e2f"). InnerVolumeSpecName "pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 01 13:11:41 crc kubenswrapper[4851]: I1001 13:11:41.995687 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486b3f7c-3593-4950-b470-2d0a2f037e2f-config" (OuterVolumeSpecName: "config") pod "486b3f7c-3593-4950-b470-2d0a2f037e2f" (UID: "486b3f7c-3593-4950-b470-2d0a2f037e2f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.001785 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486b3f7c-3593-4950-b470-2d0a2f037e2f-web-config" (OuterVolumeSpecName: "web-config") pod "486b3f7c-3593-4950-b470-2d0a2f037e2f" (UID: "486b3f7c-3593-4950-b470-2d0a2f037e2f"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.053094 4851 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/486b3f7c-3593-4950-b470-2d0a2f037e2f-web-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.053372 4851 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/486b3f7c-3593-4950-b470-2d0a2f037e2f-config-out\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.053444 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h97sc\" (UniqueName: \"kubernetes.io/projected/486b3f7c-3593-4950-b470-2d0a2f037e2f-kube-api-access-h97sc\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.053515 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/486b3f7c-3593-4950-b470-2d0a2f037e2f-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.053602 4851 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\") on node \"crc\" " Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.053685 4851 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/486b3f7c-3593-4950-b470-2d0a2f037e2f-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.053749 4851 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/486b3f7c-3593-4950-b470-2d0a2f037e2f-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.053804 4851 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/486b3f7c-3593-4950-b470-2d0a2f037e2f-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.125057 4851 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.125223 4851 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc") on node "crc" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.156654 4851 reconciler_common.go:293] "Volume detached for volume \"pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.695895 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"486b3f7c-3593-4950-b470-2d0a2f037e2f","Type":"ContainerDied","Data":"fc55e2cc6f77556bb8856c85cf003505683d767035c8b0cef7bbdc828998e098"} Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.696248 4851 scope.go:117] "RemoveContainer" containerID="46100b329bd535edf0d5df7c0f0343220da44cc5bdb743cf098c51aa229456e0" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.695985 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.708942 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a4d7762-af01-4c7e-9641-4b2054a8885d","Type":"ContainerStarted","Data":"023b8a26bca130b7f806ebc9918c7c14ee1d09a78a128362594fd4776467a0e0"} Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.708985 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a4d7762-af01-4c7e-9641-4b2054a8885d","Type":"ContainerStarted","Data":"c7351a76ca9f5e12287e7a8411ba1a9094906dbfd9b15021b8d888796cf746b0"} Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.708996 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a4d7762-af01-4c7e-9641-4b2054a8885d","Type":"ContainerStarted","Data":"71de62f86df993f0baf45ed6e0435f95f5f1ade235e6881364005c8064e6bbd2"} Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.731999 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.734429 4851 scope.go:117] "RemoveContainer" containerID="f05fb51d2a98fdac4d8fd593f16ceb782f33265a66248db8be1c3c06402958b1" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.738146 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.756318 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:11:42 crc kubenswrapper[4851]: E1001 13:11:42.756660 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="486b3f7c-3593-4950-b470-2d0a2f037e2f" containerName="prometheus" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.756677 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="486b3f7c-3593-4950-b470-2d0a2f037e2f" containerName="prometheus" Oct 01 13:11:42 crc kubenswrapper[4851]: E1001 13:11:42.756694 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="486b3f7c-3593-4950-b470-2d0a2f037e2f" containerName="thanos-sidecar" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.756700 4851 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="486b3f7c-3593-4950-b470-2d0a2f037e2f" containerName="thanos-sidecar" Oct 01 13:11:42 crc kubenswrapper[4851]: E1001 13:11:42.756716 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="486b3f7c-3593-4950-b470-2d0a2f037e2f" containerName="init-config-reloader" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.756722 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="486b3f7c-3593-4950-b470-2d0a2f037e2f" containerName="init-config-reloader" Oct 01 13:11:42 crc kubenswrapper[4851]: E1001 13:11:42.756735 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="486b3f7c-3593-4950-b470-2d0a2f037e2f" containerName="config-reloader" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.756741 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="486b3f7c-3593-4950-b470-2d0a2f037e2f" containerName="config-reloader" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.756927 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="486b3f7c-3593-4950-b470-2d0a2f037e2f" containerName="prometheus" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.756946 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="486b3f7c-3593-4950-b470-2d0a2f037e2f" containerName="config-reloader" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.756963 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="486b3f7c-3593-4950-b470-2d0a2f037e2f" containerName="thanos-sidecar" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.758364 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.761524 4851 scope.go:117] "RemoveContainer" containerID="5d3a504e2fc801c80023dc63dd412e85d75002d50155ac11fc8476348268197a" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.762295 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.762625 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-vmjkn" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.763384 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.763515 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.763612 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.767910 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.770707 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.787549 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.796057 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.324776793 podStartE2EDuration="40.796039315s" 
podCreationTimestamp="2025-10-01 13:11:02 +0000 UTC" firstStartedPulling="2025-10-01 13:11:35.969027987 +0000 UTC m=+1104.314145473" lastFinishedPulling="2025-10-01 13:11:40.440290509 +0000 UTC m=+1108.785407995" observedRunningTime="2025-10-01 13:11:42.774788979 +0000 UTC m=+1111.119906465" watchObservedRunningTime="2025-10-01 13:11:42.796039315 +0000 UTC m=+1111.141156821"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.817537 4851 scope.go:117] "RemoveContainer" containerID="e23df47d3fab5210b11bee1ab4f89352721626bac74ccd2d9dd929430f568032"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.870762 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.870802 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-config\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.870838 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.870885 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.870911 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.870928 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cde88b6a-b10f-4282-9f02-ad48a766a911-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.871235 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr88m\" (UniqueName: \"kubernetes.io/projected/cde88b6a-b10f-4282-9f02-ad48a766a911-kube-api-access-wr88m\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.871263 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cde88b6a-b10f-4282-9f02-ad48a766a911-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.871293 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.871316 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.871335 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cde88b6a-b10f-4282-9f02-ad48a766a911-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.974234 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.974341 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.974401 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.974426 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cde88b6a-b10f-4282-9f02-ad48a766a911-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.974472 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr88m\" (UniqueName: \"kubernetes.io/projected/cde88b6a-b10f-4282-9f02-ad48a766a911-kube-api-access-wr88m\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.974489 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cde88b6a-b10f-4282-9f02-ad48a766a911-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.974530 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.974556 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.974573 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cde88b6a-b10f-4282-9f02-ad48a766a911-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.974648 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.974683 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-config\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.976800 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cde88b6a-b10f-4282-9f02-ad48a766a911-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.979307 4851 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.979341 4851 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f6a0e30b7a50d8862a9481345085f62592a4f2276fdfe80014a12770adb24140/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.993406 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-config\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.993979 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cde88b6a-b10f-4282-9f02-ad48a766a911-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.994358 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.994602 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.994984 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cde88b6a-b10f-4282-9f02-ad48a766a911-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.996279 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:42 crc kubenswrapper[4851]: I1001 13:11:42.999754 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.002899 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.005316 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr88m\" (UniqueName: \"kubernetes.io/projected/cde88b6a-b10f-4282-9f02-ad48a766a911-kube-api-access-wr88m\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.086827 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b6747f4f9-jm2gc"]
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.089437 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.096654 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.113363 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b6747f4f9-jm2gc"]
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.139611 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\") pod \"prometheus-metric-storage-0\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.166457 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-96c5-account-create-rqjrx"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.178366 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6747f4f9-jm2gc\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.178418 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-dns-svc\") pod \"dnsmasq-dns-5b6747f4f9-jm2gc\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.178513 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-config\") pod \"dnsmasq-dns-5b6747f4f9-jm2gc\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.178532 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-dns-swift-storage-0\") pod \"dnsmasq-dns-5b6747f4f9-jm2gc\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.178570 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgd8k\" (UniqueName: \"kubernetes.io/projected/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-kube-api-access-vgd8k\") pod \"dnsmasq-dns-5b6747f4f9-jm2gc\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.178594 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6747f4f9-jm2gc\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.180777 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8535-account-create-j68t5"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.280072 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4kws\" (UniqueName: \"kubernetes.io/projected/700c50aa-a63d-44ea-8c51-4b76771caf18-kube-api-access-c4kws\") pod \"700c50aa-a63d-44ea-8c51-4b76771caf18\" (UID: \"700c50aa-a63d-44ea-8c51-4b76771caf18\") "
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.280357 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5wxr\" (UniqueName: \"kubernetes.io/projected/a661d676-d45b-4cf6-8b29-31c28fb2c46b-kube-api-access-q5wxr\") pod \"a661d676-d45b-4cf6-8b29-31c28fb2c46b\" (UID: \"a661d676-d45b-4cf6-8b29-31c28fb2c46b\") "
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.280633 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-config\") pod \"dnsmasq-dns-5b6747f4f9-jm2gc\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.280751 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-dns-swift-storage-0\") pod \"dnsmasq-dns-5b6747f4f9-jm2gc\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.280914 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgd8k\" (UniqueName: \"kubernetes.io/projected/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-kube-api-access-vgd8k\") pod \"dnsmasq-dns-5b6747f4f9-jm2gc\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.281034 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6747f4f9-jm2gc\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.281130 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6747f4f9-jm2gc\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.281219 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-dns-svc\") pod \"dnsmasq-dns-5b6747f4f9-jm2gc\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.281419 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-config\") pod \"dnsmasq-dns-5b6747f4f9-jm2gc\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.281542 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-dns-swift-storage-0\") pod \"dnsmasq-dns-5b6747f4f9-jm2gc\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.281747 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6747f4f9-jm2gc\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.282131 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6747f4f9-jm2gc\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.282226 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-dns-svc\") pod \"dnsmasq-dns-5b6747f4f9-jm2gc\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.286174 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a661d676-d45b-4cf6-8b29-31c28fb2c46b-kube-api-access-q5wxr" (OuterVolumeSpecName: "kube-api-access-q5wxr") pod "a661d676-d45b-4cf6-8b29-31c28fb2c46b" (UID: "a661d676-d45b-4cf6-8b29-31c28fb2c46b"). InnerVolumeSpecName "kube-api-access-q5wxr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.287685 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/700c50aa-a63d-44ea-8c51-4b76771caf18-kube-api-access-c4kws" (OuterVolumeSpecName: "kube-api-access-c4kws") pod "700c50aa-a63d-44ea-8c51-4b76771caf18" (UID: "700c50aa-a63d-44ea-8c51-4b76771caf18"). InnerVolumeSpecName "kube-api-access-c4kws". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.303752 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgd8k\" (UniqueName: \"kubernetes.io/projected/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-kube-api-access-vgd8k\") pod \"dnsmasq-dns-5b6747f4f9-jm2gc\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.382946 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4kws\" (UniqueName: \"kubernetes.io/projected/700c50aa-a63d-44ea-8c51-4b76771caf18-kube-api-access-c4kws\") on node \"crc\" DevicePath \"\""
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.382976 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5wxr\" (UniqueName: \"kubernetes.io/projected/a661d676-d45b-4cf6-8b29-31c28fb2c46b-kube-api-access-q5wxr\") on node \"crc\" DevicePath \"\""
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.390474 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.462126 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.720390 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-96c5-account-create-rqjrx" event={"ID":"a661d676-d45b-4cf6-8b29-31c28fb2c46b","Type":"ContainerDied","Data":"52ba9ee03f5240d43a100b796cb1ec5fd6305df0fb783b0c27b18d4be65a9ad4"}
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.720430 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-96c5-account-create-rqjrx"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.720441 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52ba9ee03f5240d43a100b796cb1ec5fd6305df0fb783b0c27b18d4be65a9ad4"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.733443 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8535-account-create-j68t5" event={"ID":"700c50aa-a63d-44ea-8c51-4b76771caf18","Type":"ContainerDied","Data":"9cd378bbe4b2fb3154c98c4468f37f7f0d54ca7213177d5bc5971bca973613d9"}
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.733513 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cd378bbe4b2fb3154c98c4468f37f7f0d54ca7213177d5bc5971bca973613d9"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.733591 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8535-account-create-j68t5"
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.759718 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b6747f4f9-jm2gc"]
Oct 01 13:11:43 crc kubenswrapper[4851]: W1001 13:11:43.766708 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd0e2b8e_610a_4a44_b7a7_41e7af05fde1.slice/crio-a6fc14c40426a06cd6203a4d72f9c38f10a8b71d9f78438570572446e232b317 WatchSource:0}: Error finding container a6fc14c40426a06cd6203a4d72f9c38f10a8b71d9f78438570572446e232b317: Status 404 returned error can't find the container with id a6fc14c40426a06cd6203a4d72f9c38f10a8b71d9f78438570572446e232b317
Oct 01 13:11:43 crc kubenswrapper[4851]: I1001 13:11:43.865866 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Oct 01 13:11:44 crc kubenswrapper[4851]: I1001 13:11:44.340380 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="486b3f7c-3593-4950-b470-2d0a2f037e2f" path="/var/lib/kubelet/pods/486b3f7c-3593-4950-b470-2d0a2f037e2f/volumes"
Oct 01 13:11:44 crc kubenswrapper[4851]: I1001 13:11:44.750204 4851 generic.go:334] "Generic (PLEG): container finished" podID="cd0e2b8e-610a-4a44-b7a7-41e7af05fde1" containerID="772d5d5852d36e36f1bb3aaae8b3f5b540ed039eda11764bc681d2622ac6dde5" exitCode=0
Oct 01 13:11:44 crc kubenswrapper[4851]: I1001 13:11:44.750263 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc" event={"ID":"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1","Type":"ContainerDied","Data":"772d5d5852d36e36f1bb3aaae8b3f5b540ed039eda11764bc681d2622ac6dde5"}
Oct 01 13:11:44 crc kubenswrapper[4851]: I1001 13:11:44.750335 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc" event={"ID":"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1","Type":"ContainerStarted","Data":"a6fc14c40426a06cd6203a4d72f9c38f10a8b71d9f78438570572446e232b317"}
Oct 01 13:11:44 crc kubenswrapper[4851]: I1001 13:11:44.753695 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cde88b6a-b10f-4282-9f02-ad48a766a911","Type":"ContainerStarted","Data":"dc323bd3fd6c37f9c2367dd83735e1183bcdb3249eb1d97445cfa3a79c3ddf16"}
Oct 01 13:11:45 crc kubenswrapper[4851]: I1001 13:11:45.764796 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc" event={"ID":"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1","Type":"ContainerStarted","Data":"0a5516bbe7b63d7f5a6de326a8acefedb9dac9702050318a7689407ae1994a0a"}
Oct 01 13:11:45 crc kubenswrapper[4851]: I1001 13:11:45.765211 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc"
Oct 01 13:11:45 crc kubenswrapper[4851]: I1001 13:11:45.787945 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc" podStartSLOduration=2.787927459 podStartE2EDuration="2.787927459s" podCreationTimestamp="2025-10-01 13:11:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:11:45.786226171 +0000 UTC m=+1114.131343657" watchObservedRunningTime="2025-10-01 13:11:45.787927459 +0000 UTC m=+1114.133044945"
Oct 01 13:11:45 crc kubenswrapper[4851]: I1001 13:11:45.844693 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Oct 01 13:11:46 crc kubenswrapper[4851]: I1001 13:11:46.183679 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-notifications-server-0"
Oct 01 13:11:46 crc kubenswrapper[4851]: I1001 13:11:46.406628 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Oct 01 13:11:46 crc kubenswrapper[4851]: I1001 13:11:46.775956 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cde88b6a-b10f-4282-9f02-ad48a766a911","Type":"ContainerStarted","Data":"738d102c484953c3278b0eb697d0b8cb456b39fa78b02bc54fc0e106d151a536"}
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.514404 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-2dvwq"]
Oct 01 13:11:48 crc kubenswrapper[4851]: E1001 13:11:48.519328 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a661d676-d45b-4cf6-8b29-31c28fb2c46b" containerName="mariadb-account-create"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.519347 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="a661d676-d45b-4cf6-8b29-31c28fb2c46b" containerName="mariadb-account-create"
Oct 01 13:11:48 crc kubenswrapper[4851]: E1001 13:11:48.519378 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="700c50aa-a63d-44ea-8c51-4b76771caf18" containerName="mariadb-account-create"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.519384 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="700c50aa-a63d-44ea-8c51-4b76771caf18" containerName="mariadb-account-create"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.519583 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="700c50aa-a63d-44ea-8c51-4b76771caf18" containerName="mariadb-account-create"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.519595 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="a661d676-d45b-4cf6-8b29-31c28fb2c46b" containerName="mariadb-account-create"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.520125 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2dvwq"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.524527 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2dvwq"]
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.596335 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5dgx\" (UniqueName: \"kubernetes.io/projected/a8efa1df-96f3-4954-881a-bff1284f380b-kube-api-access-w5dgx\") pod \"barbican-db-create-2dvwq\" (UID: \"a8efa1df-96f3-4954-881a-bff1284f380b\") " pod="openstack/barbican-db-create-2dvwq"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.615225 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-xg8vg"]
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.628392 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xg8vg"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.635228 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xg8vg"]
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.646878 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-f4s7j"]
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.649572 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-f4s7j"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.653241 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.653363 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-wwjvv"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.658431 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-f4s7j"]
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.698405 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5dgx\" (UniqueName: \"kubernetes.io/projected/a8efa1df-96f3-4954-881a-bff1284f380b-kube-api-access-w5dgx\") pod \"barbican-db-create-2dvwq\" (UID: \"a8efa1df-96f3-4954-881a-bff1284f380b\") " pod="openstack/barbican-db-create-2dvwq"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.698593 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbqv4\" (UniqueName: \"kubernetes.io/projected/e8779235-9b2e-48ec-a423-78de131a5da1-kube-api-access-rbqv4\") pod \"cinder-db-create-xg8vg\" (UID: \"e8779235-9b2e-48ec-a423-78de131a5da1\") " pod="openstack/cinder-db-create-xg8vg"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.742886 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5dgx\" (UniqueName: \"kubernetes.io/projected/a8efa1df-96f3-4954-881a-bff1284f380b-kube-api-access-w5dgx\") pod \"barbican-db-create-2dvwq\" (UID: \"a8efa1df-96f3-4954-881a-bff1284f380b\") " pod="openstack/barbican-db-create-2dvwq"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.800157 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6tft\" (UniqueName: \"kubernetes.io/projected/384e4797-6339-4742-9430-e87739c80e74-kube-api-access-l6tft\") pod \"watcher-db-sync-f4s7j\" (UID: \"384e4797-6339-4742-9430-e87739c80e74\") " pod="openstack/watcher-db-sync-f4s7j"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.800206 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbqv4\" (UniqueName: \"kubernetes.io/projected/e8779235-9b2e-48ec-a423-78de131a5da1-kube-api-access-rbqv4\") pod \"cinder-db-create-xg8vg\" (UID: \"e8779235-9b2e-48ec-a423-78de131a5da1\") " pod="openstack/cinder-db-create-xg8vg"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.800312 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/384e4797-6339-4742-9430-e87739c80e74-config-data\") pod \"watcher-db-sync-f4s7j\" (UID: \"384e4797-6339-4742-9430-e87739c80e74\") " pod="openstack/watcher-db-sync-f4s7j"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.800357 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384e4797-6339-4742-9430-e87739c80e74-combined-ca-bundle\") pod \"watcher-db-sync-f4s7j\" (UID: \"384e4797-6339-4742-9430-e87739c80e74\") " pod="openstack/watcher-db-sync-f4s7j"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.800376 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/384e4797-6339-4742-9430-e87739c80e74-db-sync-config-data\") pod \"watcher-db-sync-f4s7j\" (UID: \"384e4797-6339-4742-9430-e87739c80e74\") " pod="openstack/watcher-db-sync-f4s7j"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.813141 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-lbkjz"]
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.816754 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-lbkjz"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.823418 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbqv4\" (UniqueName: \"kubernetes.io/projected/e8779235-9b2e-48ec-a423-78de131a5da1-kube-api-access-rbqv4\") pod \"cinder-db-create-xg8vg\" (UID: \"e8779235-9b2e-48ec-a423-78de131a5da1\") " pod="openstack/cinder-db-create-xg8vg"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.825192 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-lbkjz"]
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.852600 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2dvwq"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.895728 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-7pnhj"]
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.901790 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384e4797-6339-4742-9430-e87739c80e74-combined-ca-bundle\") pod \"watcher-db-sync-f4s7j\" (UID: \"384e4797-6339-4742-9430-e87739c80e74\") " pod="openstack/watcher-db-sync-f4s7j"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.901846 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/384e4797-6339-4742-9430-e87739c80e74-db-sync-config-data\") pod \"watcher-db-sync-f4s7j\" (UID: \"384e4797-6339-4742-9430-e87739c80e74\") " pod="openstack/watcher-db-sync-f4s7j"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.902049 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6tft\" (UniqueName: \"kubernetes.io/projected/384e4797-6339-4742-9430-e87739c80e74-kube-api-access-l6tft\") pod \"watcher-db-sync-f4s7j\" (UID: \"384e4797-6339-4742-9430-e87739c80e74\") " pod="openstack/watcher-db-sync-f4s7j"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.902259 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/384e4797-6339-4742-9430-e87739c80e74-config-data\") pod \"watcher-db-sync-f4s7j\" (UID: \"384e4797-6339-4742-9430-e87739c80e74\") " pod="openstack/watcher-db-sync-f4s7j"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.902330 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5zcn\" (UniqueName: \"kubernetes.io/projected/26c23b6f-4490-4676-ae47-dfdf989c1d4b-kube-api-access-c5zcn\") pod \"neutron-db-create-lbkjz\" (UID: \"26c23b6f-4490-4676-ae47-dfdf989c1d4b\") " pod="openstack/neutron-db-create-lbkjz"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.904839 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-7pnhj"]
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.904934 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7pnhj"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.906247 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/384e4797-6339-4742-9430-e87739c80e74-db-sync-config-data\") pod \"watcher-db-sync-f4s7j\" (UID: \"384e4797-6339-4742-9430-e87739c80e74\") " pod="openstack/watcher-db-sync-f4s7j"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.916316 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384e4797-6339-4742-9430-e87739c80e74-combined-ca-bundle\") pod \"watcher-db-sync-f4s7j\" (UID: \"384e4797-6339-4742-9430-e87739c80e74\") " pod="openstack/watcher-db-sync-f4s7j"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.916836 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.917139 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/384e4797-6339-4742-9430-e87739c80e74-config-data\") pod \"watcher-db-sync-f4s7j\" (UID: \"384e4797-6339-4742-9430-e87739c80e74\") " pod="openstack/watcher-db-sync-f4s7j"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.917180 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.917466 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.920135 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vxcc4"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.920722 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6tft\" (UniqueName: \"kubernetes.io/projected/384e4797-6339-4742-9430-e87739c80e74-kube-api-access-l6tft\") pod \"watcher-db-sync-f4s7j\" (UID: \"384e4797-6339-4742-9430-e87739c80e74\") " pod="openstack/watcher-db-sync-f4s7j"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.946830 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xg8vg"
Oct 01 13:11:48 crc kubenswrapper[4851]: I1001 13:11:48.971841 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-f4s7j"
Oct 01 13:11:49 crc kubenswrapper[4851]: I1001 13:11:49.004566 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89d609db-765e-4137-8ed4-5364d359b402-combined-ca-bundle\") pod \"keystone-db-sync-7pnhj\" (UID: \"89d609db-765e-4137-8ed4-5364d359b402\") " pod="openstack/keystone-db-sync-7pnhj"
Oct 01 13:11:49 crc kubenswrapper[4851]: I1001 13:11:49.004748 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5zcn\" (UniqueName: \"kubernetes.io/projected/26c23b6f-4490-4676-ae47-dfdf989c1d4b-kube-api-access-c5zcn\") pod \"neutron-db-create-lbkjz\" (UID: \"26c23b6f-4490-4676-ae47-dfdf989c1d4b\") " pod="openstack/neutron-db-create-lbkjz"
Oct 01 13:11:49 crc kubenswrapper[4851]: I1001 13:11:49.004780 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtz45\" (UniqueName: \"kubernetes.io/projected/89d609db-765e-4137-8ed4-5364d359b402-kube-api-access-dtz45\") pod \"keystone-db-sync-7pnhj\" (UID: \"89d609db-765e-4137-8ed4-5364d359b402\") " pod="openstack/keystone-db-sync-7pnhj"
Oct 01 13:11:49 crc kubenswrapper[4851]: I1001 13:11:49.004813 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89d609db-765e-4137-8ed4-5364d359b402-config-data\") pod \"keystone-db-sync-7pnhj\" (UID: \"89d609db-765e-4137-8ed4-5364d359b402\") " pod="openstack/keystone-db-sync-7pnhj"
Oct 01 13:11:49 crc kubenswrapper[4851]: I1001 13:11:49.019846 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5zcn\" (UniqueName: \"kubernetes.io/projected/26c23b6f-4490-4676-ae47-dfdf989c1d4b-kube-api-access-c5zcn\") pod \"neutron-db-create-lbkjz\" (UID: \"26c23b6f-4490-4676-ae47-dfdf989c1d4b\") " pod="openstack/neutron-db-create-lbkjz"
Oct 01 13:11:49 crc kubenswrapper[4851]: I1001 13:11:49.106911 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89d609db-765e-4137-8ed4-5364d359b402-combined-ca-bundle\") pod \"keystone-db-sync-7pnhj\" (UID: \"89d609db-765e-4137-8ed4-5364d359b402\") " pod="openstack/keystone-db-sync-7pnhj"
Oct 01 13:11:49 crc kubenswrapper[4851]: I1001 13:11:49.106998 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtz45\" (UniqueName: \"kubernetes.io/projected/89d609db-765e-4137-8ed4-5364d359b402-kube-api-access-dtz45\") pod \"keystone-db-sync-7pnhj\" (UID: \"89d609db-765e-4137-8ed4-5364d359b402\") " pod="openstack/keystone-db-sync-7pnhj"
Oct 01 13:11:49 crc kubenswrapper[4851]: I1001 13:11:49.107034 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89d609db-765e-4137-8ed4-5364d359b402-config-data\") pod \"keystone-db-sync-7pnhj\" (UID: \"89d609db-765e-4137-8ed4-5364d359b402\") " pod="openstack/keystone-db-sync-7pnhj"
Oct 01 13:11:49 crc kubenswrapper[4851]: I1001 13:11:49.113254 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89d609db-765e-4137-8ed4-5364d359b402-config-data\") pod \"keystone-db-sync-7pnhj\" (UID: \"89d609db-765e-4137-8ed4-5364d359b402\") " pod="openstack/keystone-db-sync-7pnhj"
Oct 01 13:11:49 crc kubenswrapper[4851]: I1001 13:11:49.113774 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89d609db-765e-4137-8ed4-5364d359b402-combined-ca-bundle\") pod \"keystone-db-sync-7pnhj\" (UID: \"89d609db-765e-4137-8ed4-5364d359b402\") " pod="openstack/keystone-db-sync-7pnhj"
Oct 01 13:11:49 crc kubenswrapper[4851]: I1001 13:11:49.125252 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtz45\" (UniqueName: \"kubernetes.io/projected/89d609db-765e-4137-8ed4-5364d359b402-kube-api-access-dtz45\") pod \"keystone-db-sync-7pnhj\" (UID: \"89d609db-765e-4137-8ed4-5364d359b402\") " pod="openstack/keystone-db-sync-7pnhj"
Oct 01 13:11:49 crc kubenswrapper[4851]: I1001 13:11:49.190257 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-lbkjz"
Oct 01 13:11:49 crc kubenswrapper[4851]: I1001 13:11:49.301055 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7pnhj"
Oct 01 13:11:52 crc kubenswrapper[4851]: I1001 13:11:52.844450 4851 generic.go:334] "Generic (PLEG): container finished" podID="cde88b6a-b10f-4282-9f02-ad48a766a911" containerID="738d102c484953c3278b0eb697d0b8cb456b39fa78b02bc54fc0e106d151a536" exitCode=0
Oct 01 13:11:52 crc kubenswrapper[4851]: I1001 13:11:52.844575 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cde88b6a-b10f-4282-9f02-ad48a766a911","Type":"ContainerDied","Data":"738d102c484953c3278b0eb697d0b8cb456b39fa78b02bc54fc0e106d151a536"}
Oct 01 13:11:53 crc kubenswrapper[4851]: I1001 13:11:53.464610 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc"
Oct 01 13:11:53 crc kubenswrapper[4851]: I1001 13:11:53.527994 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c66b849c-m2rtt"]
Oct 01 13:11:53 crc kubenswrapper[4851]: I1001 13:11:53.528921 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" podUID="9b0bcde2-7523-41f2-bf95-17f4be1e55de" containerName="dnsmasq-dns" containerID="cri-o://30e5d47cf1ecb5ce3426302c5a3b0bedb12f82a2d45145740bfbfaec971d8312" gracePeriod=10
Oct 01 13:11:54 crc kubenswrapper[4851]: I1001 13:11:54.874441 4851 generic.go:334] "Generic (PLEG): container finished" podID="9b0bcde2-7523-41f2-bf95-17f4be1e55de" containerID="30e5d47cf1ecb5ce3426302c5a3b0bedb12f82a2d45145740bfbfaec971d8312" exitCode=0
Oct 01 13:11:54 crc kubenswrapper[4851]: I1001 13:11:54.874524 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" event={"ID":"9b0bcde2-7523-41f2-bf95-17f4be1e55de","Type":"ContainerDied","Data":"30e5d47cf1ecb5ce3426302c5a3b0bedb12f82a2d45145740bfbfaec971d8312"}
Oct 01 13:11:56 crc kubenswrapper[4851]: I1001 13:11:56.118964 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-f4s7j"]
Oct 01 13:11:56 crc kubenswrapper[4851]: I1001 13:11:56.211201 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2dvwq"]
Oct 01 13:11:56 crc kubenswrapper[4851]: W1001 13:11:56.736480 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod384e4797_6339_4742_9430_e87739c80e74.slice/crio-58a76a9e7cd97418625e8cbdc573f82c4c3dd61fac2d73395c41aaaed82471fa WatchSource:0}: Error finding container 58a76a9e7cd97418625e8cbdc573f82c4c3dd61fac2d73395c41aaaed82471fa: Status 404 returned error can't find the container with id 58a76a9e7cd97418625e8cbdc573f82c4c3dd61fac2d73395c41aaaed82471fa
Oct 01 13:11:56 crc kubenswrapper[4851]: I1001 13:11:56.900245 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2dvwq" event={"ID":"a8efa1df-96f3-4954-881a-bff1284f380b","Type":"ContainerStarted","Data":"490f9fff9371ded8b9157d7c8541f16eabd788d756228a64482896f99ad7619a"}
Oct 01 13:11:56 crc kubenswrapper[4851]: I1001 13:11:56.903218 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c66b849c-m2rtt" event={"ID":"9b0bcde2-7523-41f2-bf95-17f4be1e55de","Type":"ContainerDied","Data":"51b11d27b64c378964b9a4634c28df627d4f09a6e84c624a99ad92e4c4070fbd"}
Oct 01 13:11:56 crc kubenswrapper[4851]: I1001 13:11:56.903259 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51b11d27b64c378964b9a4634c28df627d4f09a6e84c624a99ad92e4c4070fbd"
Oct 01 13:11:56 crc kubenswrapper[4851]: I1001 13:11:56.905245 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-f4s7j" event={"ID":"384e4797-6339-4742-9430-e87739c80e74","Type":"ContainerStarted","Data":"58a76a9e7cd97418625e8cbdc573f82c4c3dd61fac2d73395c41aaaed82471fa"}
Oct 01 13:11:56 crc kubenswrapper[4851]: I1001 13:11:56.935892 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c66b849c-m2rtt"
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.100366 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-ovsdbserver-sb\") pod \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\" (UID: \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\") "
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.100991 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-dns-svc\") pod \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\" (UID: \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\") "
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.101060 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfs9f\" (UniqueName: \"kubernetes.io/projected/9b0bcde2-7523-41f2-bf95-17f4be1e55de-kube-api-access-tfs9f\") pod \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\" (UID: \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\") "
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.101150 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-config\") pod \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\" (UID: \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\") "
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.101238 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-ovsdbserver-nb\") pod \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\" (UID: \"9b0bcde2-7523-41f2-bf95-17f4be1e55de\") "
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.111555 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b0bcde2-7523-41f2-bf95-17f4be1e55de-kube-api-access-tfs9f" (OuterVolumeSpecName: "kube-api-access-tfs9f") pod "9b0bcde2-7523-41f2-bf95-17f4be1e55de" (UID: "9b0bcde2-7523-41f2-bf95-17f4be1e55de"). InnerVolumeSpecName "kube-api-access-tfs9f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.151513 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-config" (OuterVolumeSpecName: "config") pod "9b0bcde2-7523-41f2-bf95-17f4be1e55de" (UID: "9b0bcde2-7523-41f2-bf95-17f4be1e55de"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.162594 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9b0bcde2-7523-41f2-bf95-17f4be1e55de" (UID: "9b0bcde2-7523-41f2-bf95-17f4be1e55de"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.174890 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9b0bcde2-7523-41f2-bf95-17f4be1e55de" (UID: "9b0bcde2-7523-41f2-bf95-17f4be1e55de"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.177160 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9b0bcde2-7523-41f2-bf95-17f4be1e55de" (UID: "9b0bcde2-7523-41f2-bf95-17f4be1e55de"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.187128 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-lbkjz"]
Oct 01 13:11:57 crc kubenswrapper[4851]: W1001 13:11:57.195036 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26c23b6f_4490_4676_ae47_dfdf989c1d4b.slice/crio-1c187caea8213f682bf98f4e94b2739433b652d756fcc2e40dff5dca7c573ffa WatchSource:0}: Error finding container 1c187caea8213f682bf98f4e94b2739433b652d756fcc2e40dff5dca7c573ffa: Status 404 returned error can't find the container with id 1c187caea8213f682bf98f4e94b2739433b652d756fcc2e40dff5dca7c573ffa
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.202712 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.202738 4851 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.202750 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfs9f\" (UniqueName: \"kubernetes.io/projected/9b0bcde2-7523-41f2-bf95-17f4be1e55de-kube-api-access-tfs9f\") on node \"crc\" DevicePath \"\""
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.202761 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-config\") on node \"crc\" DevicePath \"\""
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.202826 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b0bcde2-7523-41f2-bf95-17f4be1e55de-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.320867 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-7pnhj"]
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.390338 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xg8vg"]
Oct 01 13:11:57 crc kubenswrapper[4851]: W1001 13:11:57.399646 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8779235_9b2e_48ec_a423_78de131a5da1.slice/crio-05da7853203e7c45ac7acbb3df178725c87b99c8b3836c019c633120e9bd5729 WatchSource:0}: Error finding container 05da7853203e7c45ac7acbb3df178725c87b99c8b3836c019c633120e9bd5729: Status 404 returned error can't find the container with id 05da7853203e7c45ac7acbb3df178725c87b99c8b3836c019c633120e9bd5729
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.917585 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hgsvg" event={"ID":"794e875d-440e-4a04-acf1-61b1e63c57d7","Type":"ContainerStarted","Data":"60b5959899e23c4f3fcb65a880fc3c306659793dcf421d6cb46dc8f1329ad197"}
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.919877 4851 generic.go:334] "Generic (PLEG): container finished" podID="26c23b6f-4490-4676-ae47-dfdf989c1d4b" containerID="33a47a2a8e2103d0d75691444679b66cb4cf7725a8330a5c4223416f51968e90" exitCode=0
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.919957 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-lbkjz" event={"ID":"26c23b6f-4490-4676-ae47-dfdf989c1d4b","Type":"ContainerDied","Data":"33a47a2a8e2103d0d75691444679b66cb4cf7725a8330a5c4223416f51968e90"}
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.919987 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-lbkjz" event={"ID":"26c23b6f-4490-4676-ae47-dfdf989c1d4b","Type":"ContainerStarted","Data":"1c187caea8213f682bf98f4e94b2739433b652d756fcc2e40dff5dca7c573ffa"}
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.922201 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7pnhj" event={"ID":"89d609db-765e-4137-8ed4-5364d359b402","Type":"ContainerStarted","Data":"3f0a56d094a428a96b9f4556693d7b1789cf35209a79b0905511ec3bfafc5c88"}
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.924618 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cde88b6a-b10f-4282-9f02-ad48a766a911","Type":"ContainerStarted","Data":"caf8b1977df89ba68e1cbf9d955d42f5ae1587d0eaf188335c919e81164ed6ea"}
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.926152 4851 generic.go:334] "Generic (PLEG): container finished" podID="e8779235-9b2e-48ec-a423-78de131a5da1" containerID="6ad311226be69c015b67fa6f59649d1455561a1038bad5aa7727e1d0a0157490" exitCode=0
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.926297 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xg8vg" event={"ID":"e8779235-9b2e-48ec-a423-78de131a5da1","Type":"ContainerDied","Data":"6ad311226be69c015b67fa6f59649d1455561a1038bad5aa7727e1d0a0157490"}
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.926337 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xg8vg" event={"ID":"e8779235-9b2e-48ec-a423-78de131a5da1","Type":"ContainerStarted","Data":"05da7853203e7c45ac7acbb3df178725c87b99c8b3836c019c633120e9bd5729"}
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.940257 4851 generic.go:334] "Generic (PLEG): container finished" podID="a8efa1df-96f3-4954-881a-bff1284f380b" containerID="edb64889e6f0e00f3865e72a0370bb34e56ec00fabef5cbe0effb40b0261a687" exitCode=0
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.940321 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2dvwq" event={"ID":"a8efa1df-96f3-4954-881a-bff1284f380b","Type":"ContainerDied","Data":"edb64889e6f0e00f3865e72a0370bb34e56ec00fabef5cbe0effb40b0261a687"}
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.940356 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c66b849c-m2rtt"
Oct 01 13:11:57 crc kubenswrapper[4851]: I1001 13:11:57.951262 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-hgsvg" podStartSLOduration=2.609118039 podStartE2EDuration="17.951239944s" podCreationTimestamp="2025-10-01 13:11:40 +0000 UTC" firstStartedPulling="2025-10-01 13:11:41.512044543 +0000 UTC m=+1109.857162019" lastFinishedPulling="2025-10-01 13:11:56.854166448 +0000 UTC m=+1125.199283924" observedRunningTime="2025-10-01 13:11:57.93147264 +0000 UTC m=+1126.276590126" watchObservedRunningTime="2025-10-01 13:11:57.951239944 +0000 UTC m=+1126.296357430"
Oct 01 13:11:58 crc kubenswrapper[4851]: I1001 13:11:58.009526 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c66b849c-m2rtt"]
Oct 01 13:11:58 crc kubenswrapper[4851]: I1001 13:11:58.017011 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c66b849c-m2rtt"]
Oct 01 13:11:58 crc kubenswrapper[4851]: I1001 13:11:58.342177 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b0bcde2-7523-41f2-bf95-17f4be1e55de" path="/var/lib/kubelet/pods/9b0bcde2-7523-41f2-bf95-17f4be1e55de/volumes"
Oct 01 13:12:00 crc kubenswrapper[4851]: I1001 13:12:00.050269 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 13:12:00 crc kubenswrapper[4851]: I1001 13:12:00.050711 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 13:12:02 crc kubenswrapper[4851]: I1001 13:12:02.004489 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cde88b6a-b10f-4282-9f02-ad48a766a911","Type":"ContainerStarted","Data":"f5f5da7e72201369b37ad67af4f52722232c57eca697c7135aebd305ad5b96f4"}
Oct 01 13:12:06 crc kubenswrapper[4851]: I1001 13:12:06.685134 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-lbkjz"
Oct 01 13:12:06 crc kubenswrapper[4851]: I1001 13:12:06.696648 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xg8vg"
Oct 01 13:12:06 crc kubenswrapper[4851]: I1001 13:12:06.791525 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5zcn\" (UniqueName: \"kubernetes.io/projected/26c23b6f-4490-4676-ae47-dfdf989c1d4b-kube-api-access-c5zcn\") pod \"26c23b6f-4490-4676-ae47-dfdf989c1d4b\" (UID: \"26c23b6f-4490-4676-ae47-dfdf989c1d4b\") "
Oct 01 13:12:06 crc kubenswrapper[4851]: I1001 13:12:06.799960 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26c23b6f-4490-4676-ae47-dfdf989c1d4b-kube-api-access-c5zcn" (OuterVolumeSpecName: "kube-api-access-c5zcn") pod "26c23b6f-4490-4676-ae47-dfdf989c1d4b" (UID: "26c23b6f-4490-4676-ae47-dfdf989c1d4b"). InnerVolumeSpecName "kube-api-access-c5zcn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:12:06 crc kubenswrapper[4851]: I1001 13:12:06.893640 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbqv4\" (UniqueName: \"kubernetes.io/projected/e8779235-9b2e-48ec-a423-78de131a5da1-kube-api-access-rbqv4\") pod \"e8779235-9b2e-48ec-a423-78de131a5da1\" (UID: \"e8779235-9b2e-48ec-a423-78de131a5da1\") "
Oct 01 13:12:06 crc kubenswrapper[4851]: I1001 13:12:06.894277 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5zcn\" (UniqueName: \"kubernetes.io/projected/26c23b6f-4490-4676-ae47-dfdf989c1d4b-kube-api-access-c5zcn\") on node \"crc\" DevicePath \"\""
Oct 01 13:12:06 crc kubenswrapper[4851]: I1001 13:12:06.898942 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8779235-9b2e-48ec-a423-78de131a5da1-kube-api-access-rbqv4" (OuterVolumeSpecName: "kube-api-access-rbqv4") pod "e8779235-9b2e-48ec-a423-78de131a5da1" (UID: "e8779235-9b2e-48ec-a423-78de131a5da1"). InnerVolumeSpecName "kube-api-access-rbqv4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:12:06 crc kubenswrapper[4851]: I1001 13:12:06.995620 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbqv4\" (UniqueName: \"kubernetes.io/projected/e8779235-9b2e-48ec-a423-78de131a5da1-kube-api-access-rbqv4\") on node \"crc\" DevicePath \"\""
Oct 01 13:12:07 crc kubenswrapper[4851]: I1001 13:12:07.071313 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-lbkjz" event={"ID":"26c23b6f-4490-4676-ae47-dfdf989c1d4b","Type":"ContainerDied","Data":"1c187caea8213f682bf98f4e94b2739433b652d756fcc2e40dff5dca7c573ffa"}
Oct 01 13:12:07 crc kubenswrapper[4851]: I1001 13:12:07.071350 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c187caea8213f682bf98f4e94b2739433b652d756fcc2e40dff5dca7c573ffa"
Oct 01 13:12:07 crc kubenswrapper[4851]: I1001 13:12:07.071400 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-lbkjz"
Oct 01 13:12:07 crc kubenswrapper[4851]: I1001 13:12:07.078875 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xg8vg" event={"ID":"e8779235-9b2e-48ec-a423-78de131a5da1","Type":"ContainerDied","Data":"05da7853203e7c45ac7acbb3df178725c87b99c8b3836c019c633120e9bd5729"}
Oct 01 13:12:07 crc kubenswrapper[4851]: I1001 13:12:07.078913 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05da7853203e7c45ac7acbb3df178725c87b99c8b3836c019c633120e9bd5729"
Oct 01 13:12:07 crc kubenswrapper[4851]: I1001 13:12:07.078997 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xg8vg"
Oct 01 13:12:11 crc kubenswrapper[4851]: I1001 13:12:11.707223 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2dvwq"
Oct 01 13:12:11 crc kubenswrapper[4851]: I1001 13:12:11.718268 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5dgx\" (UniqueName: \"kubernetes.io/projected/a8efa1df-96f3-4954-881a-bff1284f380b-kube-api-access-w5dgx\") pod \"a8efa1df-96f3-4954-881a-bff1284f380b\" (UID: \"a8efa1df-96f3-4954-881a-bff1284f380b\") "
Oct 01 13:12:11 crc kubenswrapper[4851]: I1001 13:12:11.727891 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8efa1df-96f3-4954-881a-bff1284f380b-kube-api-access-w5dgx" (OuterVolumeSpecName: "kube-api-access-w5dgx") pod "a8efa1df-96f3-4954-881a-bff1284f380b" (UID: "a8efa1df-96f3-4954-881a-bff1284f380b"). InnerVolumeSpecName "kube-api-access-w5dgx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:12:11 crc kubenswrapper[4851]: I1001 13:12:11.820960 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5dgx\" (UniqueName: \"kubernetes.io/projected/a8efa1df-96f3-4954-881a-bff1284f380b-kube-api-access-w5dgx\") on node \"crc\" DevicePath \"\""
Oct 01 13:12:12 crc kubenswrapper[4851]: I1001 13:12:12.165935 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2dvwq" event={"ID":"a8efa1df-96f3-4954-881a-bff1284f380b","Type":"ContainerDied","Data":"490f9fff9371ded8b9157d7c8541f16eabd788d756228a64482896f99ad7619a"}
Oct 01 13:12:12 crc kubenswrapper[4851]: I1001 13:12:12.165981 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="490f9fff9371ded8b9157d7c8541f16eabd788d756228a64482896f99ad7619a"
Oct 01 13:12:12 crc kubenswrapper[4851]: I1001 13:12:12.165998 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-2dvwq" Oct 01 13:12:12 crc kubenswrapper[4851]: E1001 13:12:12.227379 4851 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/podified-master-centos10/openstack-watcher-api:watcher_latest" Oct 01 13:12:12 crc kubenswrapper[4851]: E1001 13:12:12.227435 4851 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/podified-master-centos10/openstack-watcher-api:watcher_latest" Oct 01 13:12:12 crc kubenswrapper[4851]: E1001 13:12:12.227586 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:watcher-db-sync,Image:38.102.83.36:5001/podified-master-centos10/openstack-watcher-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/watcher/watcher.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:watcher-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l6tft,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-db-sync-f4s7j_openstack(384e4797-6339-4742-9430-e87739c80e74): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 13:12:12 crc kubenswrapper[4851]: E1001 13:12:12.228838 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/watcher-db-sync-f4s7j" podUID="384e4797-6339-4742-9430-e87739c80e74" Oct 01 13:12:13 crc kubenswrapper[4851]: I1001 13:12:13.183323 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-sync-7pnhj" event={"ID":"89d609db-765e-4137-8ed4-5364d359b402","Type":"ContainerStarted","Data":"6ca0ce32e81fb1f415203061c017037cb28ba1c7963705272fb9f329d843af73"} Oct 01 13:12:13 crc kubenswrapper[4851]: I1001 13:12:13.191616 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cde88b6a-b10f-4282-9f02-ad48a766a911","Type":"ContainerStarted","Data":"be33d65d4e1fa8a9798a0fbebead9be31a2ceabe441b0fbc11a7cf73b6d3bae9"} Oct 01 13:12:13 crc kubenswrapper[4851]: E1001 13:12:13.194842 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/podified-master-centos10/openstack-watcher-api:watcher_latest\\\"\"" pod="openstack/watcher-db-sync-f4s7j" podUID="384e4797-6339-4742-9430-e87739c80e74" Oct 01 13:12:13 crc kubenswrapper[4851]: I1001 13:12:13.225989 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-7pnhj" podStartSLOduration=10.387810595 podStartE2EDuration="25.225957733s" podCreationTimestamp="2025-10-01 13:11:48 +0000 UTC" firstStartedPulling="2025-10-01 13:11:57.362613171 +0000 UTC m=+1125.707730657" lastFinishedPulling="2025-10-01 13:12:12.200760309 +0000 UTC m=+1140.545877795" observedRunningTime="2025-10-01 13:12:13.211963334 +0000 UTC m=+1141.557080830" watchObservedRunningTime="2025-10-01 13:12:13.225957733 +0000 UTC m=+1141.571075259" Oct 01 13:12:13 crc kubenswrapper[4851]: I1001 13:12:13.261192 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=31.261158588 podStartE2EDuration="31.261158588s" podCreationTimestamp="2025-10-01 13:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:12:13.257855814 +0000 UTC m=+1141.602973330" watchObservedRunningTime="2025-10-01 13:12:13.261158588 +0000 UTC m=+1141.606276114" Oct 01 13:12:13 crc kubenswrapper[4851]: I1001 13:12:13.391532 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 01 13:12:13 crc kubenswrapper[4851]: I1001 13:12:13.391591 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 01 13:12:13 crc kubenswrapper[4851]: I1001 13:12:13.397246 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 01 13:12:14 crc kubenswrapper[4851]: I1001 13:12:14.203229 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.578263 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-aacf-account-create-dsc7w"] Oct 01 13:12:18 crc kubenswrapper[4851]: E1001 13:12:18.579459 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8779235-9b2e-48ec-a423-78de131a5da1" containerName="mariadb-database-create" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.579481 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8779235-9b2e-48ec-a423-78de131a5da1" containerName="mariadb-database-create" Oct 01 13:12:18 crc kubenswrapper[4851]: E1001 13:12:18.580854 4851 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a8efa1df-96f3-4954-881a-bff1284f380b" containerName="mariadb-database-create" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.580876 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8efa1df-96f3-4954-881a-bff1284f380b" containerName="mariadb-database-create" Oct 01 13:12:18 crc kubenswrapper[4851]: E1001 13:12:18.580918 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b0bcde2-7523-41f2-bf95-17f4be1e55de" containerName="dnsmasq-dns" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.580932 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b0bcde2-7523-41f2-bf95-17f4be1e55de" containerName="dnsmasq-dns" Oct 01 13:12:18 crc kubenswrapper[4851]: E1001 13:12:18.580955 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b0bcde2-7523-41f2-bf95-17f4be1e55de" containerName="init" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.580966 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b0bcde2-7523-41f2-bf95-17f4be1e55de" containerName="init" Oct 01 13:12:18 crc kubenswrapper[4851]: E1001 13:12:18.580984 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c23b6f-4490-4676-ae47-dfdf989c1d4b" containerName="mariadb-database-create" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.580995 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c23b6f-4490-4676-ae47-dfdf989c1d4b" containerName="mariadb-database-create" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.581358 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8efa1df-96f3-4954-881a-bff1284f380b" containerName="mariadb-database-create" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.581388 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="26c23b6f-4490-4676-ae47-dfdf989c1d4b" containerName="mariadb-database-create" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.581415 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8779235-9b2e-48ec-a423-78de131a5da1" containerName="mariadb-database-create" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.581432 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b0bcde2-7523-41f2-bf95-17f4be1e55de" containerName="dnsmasq-dns" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.582367 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-aacf-account-create-dsc7w" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.585267 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.590627 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-aacf-account-create-dsc7w"] Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.648712 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzxt4\" (UniqueName: \"kubernetes.io/projected/f69aa5c0-2e72-4563-a8f8-5915b4bd0dbc-kube-api-access-nzxt4\") pod \"barbican-aacf-account-create-dsc7w\" (UID: \"f69aa5c0-2e72-4563-a8f8-5915b4bd0dbc\") " pod="openstack/barbican-aacf-account-create-dsc7w" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.663401 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-6405-account-create-9stkn"] Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.664714 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-6405-account-create-9stkn" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.666909 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.688388 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6405-account-create-9stkn"] Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.750811 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d28rg\" (UniqueName: \"kubernetes.io/projected/eeff4acd-ae99-4646-a186-6bdfb431a52d-kube-api-access-d28rg\") pod \"cinder-6405-account-create-9stkn\" (UID: \"eeff4acd-ae99-4646-a186-6bdfb431a52d\") " pod="openstack/cinder-6405-account-create-9stkn" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.750892 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzxt4\" (UniqueName: \"kubernetes.io/projected/f69aa5c0-2e72-4563-a8f8-5915b4bd0dbc-kube-api-access-nzxt4\") pod \"barbican-aacf-account-create-dsc7w\" (UID: \"f69aa5c0-2e72-4563-a8f8-5915b4bd0dbc\") " pod="openstack/barbican-aacf-account-create-dsc7w" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.787290 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzxt4\" (UniqueName: \"kubernetes.io/projected/f69aa5c0-2e72-4563-a8f8-5915b4bd0dbc-kube-api-access-nzxt4\") pod \"barbican-aacf-account-create-dsc7w\" (UID: \"f69aa5c0-2e72-4563-a8f8-5915b4bd0dbc\") " pod="openstack/barbican-aacf-account-create-dsc7w" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.852475 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d28rg\" (UniqueName: \"kubernetes.io/projected/eeff4acd-ae99-4646-a186-6bdfb431a52d-kube-api-access-d28rg\") pod \"cinder-6405-account-create-9stkn\" (UID: \"eeff4acd-ae99-4646-a186-6bdfb431a52d\") " pod="openstack/cinder-6405-account-create-9stkn" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.880793 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d28rg\" (UniqueName: \"kubernetes.io/projected/eeff4acd-ae99-4646-a186-6bdfb431a52d-kube-api-access-d28rg\") pod \"cinder-6405-account-create-9stkn\" (UID: \"eeff4acd-ae99-4646-a186-6bdfb431a52d\") " pod="openstack/cinder-6405-account-create-9stkn" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.920176 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-aacf-account-create-dsc7w" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.954316 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-02e4-account-create-wtbvx"] Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.955662 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-02e4-account-create-wtbvx" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.958365 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.988599 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-02e4-account-create-wtbvx"] Oct 01 13:12:18 crc kubenswrapper[4851]: I1001 13:12:18.992951 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-6405-account-create-9stkn" Oct 01 13:12:19 crc kubenswrapper[4851]: I1001 13:12:19.063006 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwbtq\" (UniqueName: \"kubernetes.io/projected/d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4-kube-api-access-dwbtq\") pod \"neutron-02e4-account-create-wtbvx\" (UID: \"d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4\") " pod="openstack/neutron-02e4-account-create-wtbvx" Oct 01 13:12:19 crc kubenswrapper[4851]: I1001 13:12:19.164544 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwbtq\" (UniqueName: \"kubernetes.io/projected/d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4-kube-api-access-dwbtq\") pod \"neutron-02e4-account-create-wtbvx\" (UID: \"d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4\") " pod="openstack/neutron-02e4-account-create-wtbvx" Oct 01 13:12:19 crc kubenswrapper[4851]: I1001 13:12:19.183662 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwbtq\" (UniqueName: \"kubernetes.io/projected/d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4-kube-api-access-dwbtq\") pod \"neutron-02e4-account-create-wtbvx\" (UID: \"d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4\") " pod="openstack/neutron-02e4-account-create-wtbvx" Oct 01 13:12:19 crc kubenswrapper[4851]: I1001 13:12:19.408381 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-aacf-account-create-dsc7w"] Oct 01 13:12:19 crc kubenswrapper[4851]: I1001 13:12:19.416609 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-02e4-account-create-wtbvx" Oct 01 13:12:19 crc kubenswrapper[4851]: W1001 13:12:19.420038 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf69aa5c0_2e72_4563_a8f8_5915b4bd0dbc.slice/crio-33ca5592a2acc4618e9a013946f61570c28273697111a3c89ed82ede5d108fa9 WatchSource:0}: Error finding container 33ca5592a2acc4618e9a013946f61570c28273697111a3c89ed82ede5d108fa9: Status 404 returned error can't find the container with id 33ca5592a2acc4618e9a013946f61570c28273697111a3c89ed82ede5d108fa9 Oct 01 13:12:19 crc kubenswrapper[4851]: W1001 13:12:19.504208 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeeff4acd_ae99_4646_a186_6bdfb431a52d.slice/crio-0978df98c68650a45ead6417c86878062c5509607cf7d4e80d10f6e3c9ace56f WatchSource:0}: Error finding container 0978df98c68650a45ead6417c86878062c5509607cf7d4e80d10f6e3c9ace56f: Status 404 returned error can't find the container with id 0978df98c68650a45ead6417c86878062c5509607cf7d4e80d10f6e3c9ace56f Oct 01 13:12:19 crc kubenswrapper[4851]: I1001 13:12:19.507156 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6405-account-create-9stkn"] Oct 01 13:12:20 crc kubenswrapper[4851]: I1001 13:12:20.018316 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-02e4-account-create-wtbvx"] Oct 01 13:12:20 crc kubenswrapper[4851]: I1001 13:12:20.265618 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-02e4-account-create-wtbvx" event={"ID":"d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4","Type":"ContainerStarted","Data":"a503b22708c0c59620b22b0206a1ea38fdd6791290d8c38aadebbcc38398a25b"} Oct 01 13:12:20 crc kubenswrapper[4851]: I1001 13:12:20.265969 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-02e4-account-create-wtbvx" event={"ID":"d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4","Type":"ContainerStarted","Data":"e3f1391ed5dbdd146e892e45cb7219c3940de87c224aeb6fa3de1d86d90de208"} Oct 01 13:12:20 crc kubenswrapper[4851]: I1001 13:12:20.267388 4851 generic.go:334] "Generic (PLEG): container finished" podID="eeff4acd-ae99-4646-a186-6bdfb431a52d" containerID="0896d580957f4080564c45aaf5cd9e3ed1e67e84ae1881f043ae0f1f24723a8e" exitCode=0 Oct 01 13:12:20 crc kubenswrapper[4851]: I1001 13:12:20.267438 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6405-account-create-9stkn" event={"ID":"eeff4acd-ae99-4646-a186-6bdfb431a52d","Type":"ContainerDied","Data":"0896d580957f4080564c45aaf5cd9e3ed1e67e84ae1881f043ae0f1f24723a8e"} Oct 01 13:12:20 crc kubenswrapper[4851]: I1001 13:12:20.267454 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6405-account-create-9stkn" event={"ID":"eeff4acd-ae99-4646-a186-6bdfb431a52d","Type":"ContainerStarted","Data":"0978df98c68650a45ead6417c86878062c5509607cf7d4e80d10f6e3c9ace56f"} Oct 01 13:12:20 crc kubenswrapper[4851]: I1001 13:12:20.269476 4851 generic.go:334] "Generic (PLEG): container finished" podID="f69aa5c0-2e72-4563-a8f8-5915b4bd0dbc" containerID="025dbe9a776e103362c1718f705a3118f52d486ce84bd252907be0b21419bb46" exitCode=0 Oct 01 13:12:20 crc kubenswrapper[4851]: I1001 13:12:20.269620 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-aacf-account-create-dsc7w" event={"ID":"f69aa5c0-2e72-4563-a8f8-5915b4bd0dbc","Type":"ContainerDied","Data":"025dbe9a776e103362c1718f705a3118f52d486ce84bd252907be0b21419bb46"} Oct 01 13:12:20 crc kubenswrapper[4851]: I1001 13:12:20.269653 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-aacf-account-create-dsc7w" event={"ID":"f69aa5c0-2e72-4563-a8f8-5915b4bd0dbc","Type":"ContainerStarted","Data":"33ca5592a2acc4618e9a013946f61570c28273697111a3c89ed82ede5d108fa9"} Oct 01 13:12:20 crc kubenswrapper[4851]: I1001 13:12:20.275681 4851 generic.go:334] "Generic (PLEG): container finished" podID="89d609db-765e-4137-8ed4-5364d359b402" containerID="6ca0ce32e81fb1f415203061c017037cb28ba1c7963705272fb9f329d843af73" exitCode=0 Oct 01 13:12:20 crc kubenswrapper[4851]: I1001 13:12:20.275725 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7pnhj" event={"ID":"89d609db-765e-4137-8ed4-5364d359b402","Type":"ContainerDied","Data":"6ca0ce32e81fb1f415203061c017037cb28ba1c7963705272fb9f329d843af73"} Oct 01 13:12:20 crc kubenswrapper[4851]: I1001 13:12:20.287405 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-02e4-account-create-wtbvx" podStartSLOduration=2.287383783 podStartE2EDuration="2.287383783s" podCreationTimestamp="2025-10-01 13:12:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:12:20.281419543 +0000 UTC m=+1148.626537049" watchObservedRunningTime="2025-10-01 13:12:20.287383783 +0000 UTC m=+1148.632501269" Oct 01 13:12:21 crc kubenswrapper[4851]: I1001 13:12:21.287158 4851 generic.go:334] "Generic (PLEG): container finished" podID="d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4" containerID="a503b22708c0c59620b22b0206a1ea38fdd6791290d8c38aadebbcc38398a25b" exitCode=0 Oct 01 13:12:21 crc kubenswrapper[4851]: I1001 13:12:21.287381 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-02e4-account-create-wtbvx" 
event={"ID":"d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4","Type":"ContainerDied","Data":"a503b22708c0c59620b22b0206a1ea38fdd6791290d8c38aadebbcc38398a25b"} Oct 01 13:12:21 crc kubenswrapper[4851]: I1001 13:12:21.290828 4851 generic.go:334] "Generic (PLEG): container finished" podID="794e875d-440e-4a04-acf1-61b1e63c57d7" containerID="60b5959899e23c4f3fcb65a880fc3c306659793dcf421d6cb46dc8f1329ad197" exitCode=0 Oct 01 13:12:21 crc kubenswrapper[4851]: I1001 13:12:21.291048 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hgsvg" event={"ID":"794e875d-440e-4a04-acf1-61b1e63c57d7","Type":"ContainerDied","Data":"60b5959899e23c4f3fcb65a880fc3c306659793dcf421d6cb46dc8f1329ad197"} Oct 01 13:12:21 crc kubenswrapper[4851]: I1001 13:12:21.821226 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7pnhj" Oct 01 13:12:21 crc kubenswrapper[4851]: I1001 13:12:21.828458 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-aacf-account-create-dsc7w" Oct 01 13:12:21 crc kubenswrapper[4851]: I1001 13:12:21.837243 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6405-account-create-9stkn" Oct 01 13:12:21 crc kubenswrapper[4851]: I1001 13:12:21.920654 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89d609db-765e-4137-8ed4-5364d359b402-config-data\") pod \"89d609db-765e-4137-8ed4-5364d359b402\" (UID: \"89d609db-765e-4137-8ed4-5364d359b402\") " Oct 01 13:12:21 crc kubenswrapper[4851]: I1001 13:12:21.920760 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d28rg\" (UniqueName: \"kubernetes.io/projected/eeff4acd-ae99-4646-a186-6bdfb431a52d-kube-api-access-d28rg\") pod \"eeff4acd-ae99-4646-a186-6bdfb431a52d\" (UID: \"eeff4acd-ae99-4646-a186-6bdfb431a52d\") " Oct 01 13:12:21 crc kubenswrapper[4851]: I1001 13:12:21.920818 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzxt4\" (UniqueName: \"kubernetes.io/projected/f69aa5c0-2e72-4563-a8f8-5915b4bd0dbc-kube-api-access-nzxt4\") pod \"f69aa5c0-2e72-4563-a8f8-5915b4bd0dbc\" (UID: \"f69aa5c0-2e72-4563-a8f8-5915b4bd0dbc\") " Oct 01 13:12:21 crc kubenswrapper[4851]: I1001 13:12:21.920861 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89d609db-765e-4137-8ed4-5364d359b402-combined-ca-bundle\") pod \"89d609db-765e-4137-8ed4-5364d359b402\" (UID: \"89d609db-765e-4137-8ed4-5364d359b402\") " Oct 01 13:12:21 crc kubenswrapper[4851]: I1001 13:12:21.921005 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtz45\" (UniqueName: \"kubernetes.io/projected/89d609db-765e-4137-8ed4-5364d359b402-kube-api-access-dtz45\") pod \"89d609db-765e-4137-8ed4-5364d359b402\" (UID: \"89d609db-765e-4137-8ed4-5364d359b402\") " Oct 01 13:12:21 crc kubenswrapper[4851]: I1001 13:12:21.928792 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeff4acd-ae99-4646-a186-6bdfb431a52d-kube-api-access-d28rg" (OuterVolumeSpecName: "kube-api-access-d28rg") pod "eeff4acd-ae99-4646-a186-6bdfb431a52d" (UID: "eeff4acd-ae99-4646-a186-6bdfb431a52d"). InnerVolumeSpecName "kube-api-access-d28rg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:12:21 crc kubenswrapper[4851]: I1001 13:12:21.928878 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89d609db-765e-4137-8ed4-5364d359b402-kube-api-access-dtz45" (OuterVolumeSpecName: "kube-api-access-dtz45") pod "89d609db-765e-4137-8ed4-5364d359b402" (UID: "89d609db-765e-4137-8ed4-5364d359b402"). InnerVolumeSpecName "kube-api-access-dtz45". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:12:21 crc kubenswrapper[4851]: I1001 13:12:21.928906 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f69aa5c0-2e72-4563-a8f8-5915b4bd0dbc-kube-api-access-nzxt4" (OuterVolumeSpecName: "kube-api-access-nzxt4") pod "f69aa5c0-2e72-4563-a8f8-5915b4bd0dbc" (UID: "f69aa5c0-2e72-4563-a8f8-5915b4bd0dbc"). InnerVolumeSpecName "kube-api-access-nzxt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:12:21 crc kubenswrapper[4851]: I1001 13:12:21.948407 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89d609db-765e-4137-8ed4-5364d359b402-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89d609db-765e-4137-8ed4-5364d359b402" (UID: "89d609db-765e-4137-8ed4-5364d359b402"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:12:21 crc kubenswrapper[4851]: I1001 13:12:21.974479 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89d609db-765e-4137-8ed4-5364d359b402-config-data" (OuterVolumeSpecName: "config-data") pod "89d609db-765e-4137-8ed4-5364d359b402" (UID: "89d609db-765e-4137-8ed4-5364d359b402"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.024968 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtz45\" (UniqueName: \"kubernetes.io/projected/89d609db-765e-4137-8ed4-5364d359b402-kube-api-access-dtz45\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.025022 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89d609db-765e-4137-8ed4-5364d359b402-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.025042 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d28rg\" (UniqueName: \"kubernetes.io/projected/eeff4acd-ae99-4646-a186-6bdfb431a52d-kube-api-access-d28rg\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.025061 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzxt4\" (UniqueName: \"kubernetes.io/projected/f69aa5c0-2e72-4563-a8f8-5915b4bd0dbc-kube-api-access-nzxt4\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.025079 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89d609db-765e-4137-8ed4-5364d359b402-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.305240 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6405-account-create-9stkn" event={"ID":"eeff4acd-ae99-4646-a186-6bdfb431a52d","Type":"ContainerDied","Data":"0978df98c68650a45ead6417c86878062c5509607cf7d4e80d10f6e3c9ace56f"} Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.305299 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0978df98c68650a45ead6417c86878062c5509607cf7d4e80d10f6e3c9ace56f" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.305313 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6405-account-create-9stkn" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.307670 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-aacf-account-create-dsc7w" event={"ID":"f69aa5c0-2e72-4563-a8f8-5915b4bd0dbc","Type":"ContainerDied","Data":"33ca5592a2acc4618e9a013946f61570c28273697111a3c89ed82ede5d108fa9"} Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.307907 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33ca5592a2acc4618e9a013946f61570c28273697111a3c89ed82ede5d108fa9" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.308053 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-aacf-account-create-dsc7w" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.310415 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7pnhj" event={"ID":"89d609db-765e-4137-8ed4-5364d359b402","Type":"ContainerDied","Data":"3f0a56d094a428a96b9f4556693d7b1789cf35209a79b0905511ec3bfafc5c88"} Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.310574 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-7pnhj" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.310596 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f0a56d094a428a96b9f4556693d7b1789cf35209a79b0905511ec3bfafc5c88" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.487934 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fd86bf475-zh9k8"] Oct 01 13:12:22 crc kubenswrapper[4851]: E1001 13:12:22.488342 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89d609db-765e-4137-8ed4-5364d359b402" containerName="keystone-db-sync" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.488358 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="89d609db-765e-4137-8ed4-5364d359b402" containerName="keystone-db-sync" Oct 01 13:12:22 crc kubenswrapper[4851]: E1001 13:12:22.488394 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeff4acd-ae99-4646-a186-6bdfb431a52d" containerName="mariadb-account-create" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.488400 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeff4acd-ae99-4646-a186-6bdfb431a52d" containerName="mariadb-account-create" Oct 01 13:12:22 crc kubenswrapper[4851]: E1001 13:12:22.488429 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f69aa5c0-2e72-4563-a8f8-5915b4bd0dbc" containerName="mariadb-account-create" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.488435 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="f69aa5c0-2e72-4563-a8f8-5915b4bd0dbc" containerName="mariadb-account-create" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.497099 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="f69aa5c0-2e72-4563-a8f8-5915b4bd0dbc" containerName="mariadb-account-create" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.497167 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeff4acd-ae99-4646-a186-6bdfb431a52d" containerName="mariadb-account-create" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.497204 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="89d609db-765e-4137-8ed4-5364d359b402" containerName="keystone-db-sync" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.503646 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.566990 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fd86bf475-zh9k8"] Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.584356 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-b6cn9"] Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.585583 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-b6cn9" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.589716 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.589969 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.590078 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vxcc4" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.590159 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.620067 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-b6cn9"] Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.661444 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch26x\" (UniqueName: \"kubernetes.io/projected/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-kube-api-access-ch26x\") pod \"keystone-bootstrap-b6cn9\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " pod="openstack/keystone-bootstrap-b6cn9" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.661535 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-ovsdbserver-sb\") pod \"dnsmasq-dns-fd86bf475-zh9k8\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.661573 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-dns-swift-storage-0\") pod \"dnsmasq-dns-fd86bf475-zh9k8\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.661612 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-ovsdbserver-nb\") pod \"dnsmasq-dns-fd86bf475-zh9k8\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.661650 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-credential-keys\") pod \"keystone-bootstrap-b6cn9\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " pod="openstack/keystone-bootstrap-b6cn9" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.661695 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-scripts\") pod \"keystone-bootstrap-b6cn9\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " pod="openstack/keystone-bootstrap-b6cn9" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.661719 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-dns-svc\") pod \"dnsmasq-dns-fd86bf475-zh9k8\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.661737 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-fernet-keys\") pod \"keystone-bootstrap-b6cn9\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " pod="openstack/keystone-bootstrap-b6cn9" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.661768 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-config-data\") pod \"keystone-bootstrap-b6cn9\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " pod="openstack/keystone-bootstrap-b6cn9" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.661799 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-config\") pod \"dnsmasq-dns-fd86bf475-zh9k8\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.661824 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-combined-ca-bundle\") pod \"keystone-bootstrap-b6cn9\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " pod="openstack/keystone-bootstrap-b6cn9" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.661844 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42m72\" (UniqueName: \"kubernetes.io/projected/41001489-8c26-4e07-a72d-f4fe8d4fe636-kube-api-access-42m72\") pod \"dnsmasq-dns-fd86bf475-zh9k8\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.723569 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-ddb6c45b5-ncjrz"] Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.725329 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-ddb6c45b5-ncjrz" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.728811 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.729086 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-fk7jg" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.731261 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.731429 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.749415 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-ddb6c45b5-ncjrz"] Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.767229 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-ovsdbserver-sb\") pod \"dnsmasq-dns-fd86bf475-zh9k8\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.767298 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-dns-swift-storage-0\") pod \"dnsmasq-dns-fd86bf475-zh9k8\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.767338 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-ovsdbserver-nb\") pod \"dnsmasq-dns-fd86bf475-zh9k8\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.767370 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-credential-keys\") pod \"keystone-bootstrap-b6cn9\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " pod="openstack/keystone-bootstrap-b6cn9" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.767418 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-scripts\") pod \"keystone-bootstrap-b6cn9\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " pod="openstack/keystone-bootstrap-b6cn9" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.767448 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-dns-svc\") pod \"dnsmasq-dns-fd86bf475-zh9k8\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.767467 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-fernet-keys\") pod \"keystone-bootstrap-b6cn9\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " pod="openstack/keystone-bootstrap-b6cn9" Oct 01 
13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.767516 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-config-data\") pod \"keystone-bootstrap-b6cn9\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " pod="openstack/keystone-bootstrap-b6cn9" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.767549 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-config\") pod \"dnsmasq-dns-fd86bf475-zh9k8\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.767574 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-combined-ca-bundle\") pod \"keystone-bootstrap-b6cn9\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " pod="openstack/keystone-bootstrap-b6cn9" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.767593 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42m72\" (UniqueName: \"kubernetes.io/projected/41001489-8c26-4e07-a72d-f4fe8d4fe636-kube-api-access-42m72\") pod \"dnsmasq-dns-fd86bf475-zh9k8\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.767614 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch26x\" (UniqueName: \"kubernetes.io/projected/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-kube-api-access-ch26x\") pod \"keystone-bootstrap-b6cn9\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " pod="openstack/keystone-bootstrap-b6cn9" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.769854 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-dns-svc\") pod \"dnsmasq-dns-fd86bf475-zh9k8\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.770017 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-ovsdbserver-nb\") pod \"dnsmasq-dns-fd86bf475-zh9k8\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.770626 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-dns-swift-storage-0\") pod \"dnsmasq-dns-fd86bf475-zh9k8\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.770790 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-ovsdbserver-sb\") pod \"dnsmasq-dns-fd86bf475-zh9k8\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.771618 4851 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-config\") pod \"dnsmasq-dns-fd86bf475-zh9k8\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.817916 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-credential-keys\") pod \"keystone-bootstrap-b6cn9\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " pod="openstack/keystone-bootstrap-b6cn9" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.818111 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-scripts\") pod \"keystone-bootstrap-b6cn9\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " pod="openstack/keystone-bootstrap-b6cn9" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.818341 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42m72\" (UniqueName: \"kubernetes.io/projected/41001489-8c26-4e07-a72d-f4fe8d4fe636-kube-api-access-42m72\") pod \"dnsmasq-dns-fd86bf475-zh9k8\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.818572 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-fernet-keys\") pod \"keystone-bootstrap-b6cn9\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " pod="openstack/keystone-bootstrap-b6cn9" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.818934 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch26x\" (UniqueName: \"kubernetes.io/projected/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-kube-api-access-ch26x\") pod \"keystone-bootstrap-b6cn9\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " pod="openstack/keystone-bootstrap-b6cn9" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.818956 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-config-data\") pod \"keystone-bootstrap-b6cn9\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " pod="openstack/keystone-bootstrap-b6cn9" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.830893 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.830988 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-combined-ca-bundle\") pod \"keystone-bootstrap-b6cn9\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " pod="openstack/keystone-bootstrap-b6cn9" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.875660 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ac095f6-4433-4c0e-9299-dcad17dff9fa-logs\") pod \"horizon-ddb6c45b5-ncjrz\" (UID: \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\") " pod="openstack/horizon-ddb6c45b5-ncjrz" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.875723 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ac095f6-4433-4c0e-9299-dcad17dff9fa-config-data\") pod \"horizon-ddb6c45b5-ncjrz\" (UID: \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\") " pod="openstack/horizon-ddb6c45b5-ncjrz" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.875788 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ac095f6-4433-4c0e-9299-dcad17dff9fa-scripts\") pod \"horizon-ddb6c45b5-ncjrz\" (UID: \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\") " pod="openstack/horizon-ddb6c45b5-ncjrz" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.875835 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2ac095f6-4433-4c0e-9299-dcad17dff9fa-horizon-secret-key\") pod \"horizon-ddb6c45b5-ncjrz\" (UID: \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\") " pod="openstack/horizon-ddb6c45b5-ncjrz" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.875857 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs4rg\" (UniqueName: \"kubernetes.io/projected/2ac095f6-4433-4c0e-9299-dcad17dff9fa-kube-api-access-xs4rg\") pod \"horizon-ddb6c45b5-ncjrz\" (UID: \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\") " pod="openstack/horizon-ddb6c45b5-ncjrz" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.879172 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-86559649f7-slxjz"] Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.890711 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86559649f7-slxjz" Oct 01 13:12:22 crc kubenswrapper[4851]: I1001 13:12:22.928146 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-b6cn9" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:22.993264 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9843bd36-0528-4494-9b3f-f1975e2b15f2-horizon-secret-key\") pod \"horizon-86559649f7-slxjz\" (UID: \"9843bd36-0528-4494-9b3f-f1975e2b15f2\") " pod="openstack/horizon-86559649f7-slxjz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:22.993302 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pmds\" (UniqueName: \"kubernetes.io/projected/9843bd36-0528-4494-9b3f-f1975e2b15f2-kube-api-access-5pmds\") pod \"horizon-86559649f7-slxjz\" (UID: \"9843bd36-0528-4494-9b3f-f1975e2b15f2\") " pod="openstack/horizon-86559649f7-slxjz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:22.993333 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9843bd36-0528-4494-9b3f-f1975e2b15f2-logs\") pod \"horizon-86559649f7-slxjz\" (UID: \"9843bd36-0528-4494-9b3f-f1975e2b15f2\") " pod="openstack/horizon-86559649f7-slxjz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:22.993351 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9843bd36-0528-4494-9b3f-f1975e2b15f2-scripts\") pod \"horizon-86559649f7-slxjz\" (UID: \"9843bd36-0528-4494-9b3f-f1975e2b15f2\") " pod="openstack/horizon-86559649f7-slxjz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:22.993373 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ac095f6-4433-4c0e-9299-dcad17dff9fa-logs\") pod \"horizon-ddb6c45b5-ncjrz\" (UID: \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\") " pod="openstack/horizon-ddb6c45b5-ncjrz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:22.993400 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ac095f6-4433-4c0e-9299-dcad17dff9fa-config-data\") pod \"horizon-ddb6c45b5-ncjrz\" (UID: \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\") " pod="openstack/horizon-ddb6c45b5-ncjrz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:22.993425 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9843bd36-0528-4494-9b3f-f1975e2b15f2-config-data\") pod \"horizon-86559649f7-slxjz\" (UID: \"9843bd36-0528-4494-9b3f-f1975e2b15f2\") " pod="openstack/horizon-86559649f7-slxjz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:22.993463 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ac095f6-4433-4c0e-9299-dcad17dff9fa-scripts\") pod \"horizon-ddb6c45b5-ncjrz\" (UID: \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\") " pod="openstack/horizon-ddb6c45b5-ncjrz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:22.993525 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2ac095f6-4433-4c0e-9299-dcad17dff9fa-horizon-secret-key\") pod \"horizon-ddb6c45b5-ncjrz\" (UID: \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\") " pod="openstack/horizon-ddb6c45b5-ncjrz" Oct 01 13:12:23 crc 
kubenswrapper[4851]: I1001 13:12:22.993546 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs4rg\" (UniqueName: \"kubernetes.io/projected/2ac095f6-4433-4c0e-9299-dcad17dff9fa-kube-api-access-xs4rg\") pod \"horizon-ddb6c45b5-ncjrz\" (UID: \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\") " pod="openstack/horizon-ddb6c45b5-ncjrz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:22.993808 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86559649f7-slxjz"] Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:22.994196 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ac095f6-4433-4c0e-9299-dcad17dff9fa-logs\") pod \"horizon-ddb6c45b5-ncjrz\" (UID: \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\") " pod="openstack/horizon-ddb6c45b5-ncjrz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.006953 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ac095f6-4433-4c0e-9299-dcad17dff9fa-config-data\") pod \"horizon-ddb6c45b5-ncjrz\" (UID: \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\") " pod="openstack/horizon-ddb6c45b5-ncjrz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.012155 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ac095f6-4433-4c0e-9299-dcad17dff9fa-scripts\") pod \"horizon-ddb6c45b5-ncjrz\" (UID: \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\") " pod="openstack/horizon-ddb6c45b5-ncjrz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.032377 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs4rg\" (UniqueName: \"kubernetes.io/projected/2ac095f6-4433-4c0e-9299-dcad17dff9fa-kube-api-access-xs4rg\") pod \"horizon-ddb6c45b5-ncjrz\" (UID: \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\") " pod="openstack/horizon-ddb6c45b5-ncjrz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.048965 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2ac095f6-4433-4c0e-9299-dcad17dff9fa-horizon-secret-key\") pod \"horizon-ddb6c45b5-ncjrz\" (UID: \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\") " pod="openstack/horizon-ddb6c45b5-ncjrz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.052590 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-ddb6c45b5-ncjrz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.068227 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-02e4-account-create-wtbvx" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.096762 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9843bd36-0528-4494-9b3f-f1975e2b15f2-config-data\") pod \"horizon-86559649f7-slxjz\" (UID: \"9843bd36-0528-4494-9b3f-f1975e2b15f2\") " pod="openstack/horizon-86559649f7-slxjz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.096881 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9843bd36-0528-4494-9b3f-f1975e2b15f2-horizon-secret-key\") pod \"horizon-86559649f7-slxjz\" (UID: \"9843bd36-0528-4494-9b3f-f1975e2b15f2\") " pod="openstack/horizon-86559649f7-slxjz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.096911 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pmds\" (UniqueName: \"kubernetes.io/projected/9843bd36-0528-4494-9b3f-f1975e2b15f2-kube-api-access-5pmds\") pod \"horizon-86559649f7-slxjz\" (UID: \"9843bd36-0528-4494-9b3f-f1975e2b15f2\") " pod="openstack/horizon-86559649f7-slxjz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.096963 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9843bd36-0528-4494-9b3f-f1975e2b15f2-logs\") pod \"horizon-86559649f7-slxjz\" (UID: \"9843bd36-0528-4494-9b3f-f1975e2b15f2\") " pod="openstack/horizon-86559649f7-slxjz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.096981 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9843bd36-0528-4494-9b3f-f1975e2b15f2-scripts\") pod \"horizon-86559649f7-slxjz\" (UID: \"9843bd36-0528-4494-9b3f-f1975e2b15f2\") " pod="openstack/horizon-86559649f7-slxjz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.097611 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9843bd36-0528-4494-9b3f-f1975e2b15f2-scripts\") pod \"horizon-86559649f7-slxjz\" (UID: \"9843bd36-0528-4494-9b3f-f1975e2b15f2\") " pod="openstack/horizon-86559649f7-slxjz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.098488 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9843bd36-0528-4494-9b3f-f1975e2b15f2-config-data\") pod \"horizon-86559649f7-slxjz\" (UID: \"9843bd36-0528-4494-9b3f-f1975e2b15f2\") " pod="openstack/horizon-86559649f7-slxjz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.102929 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9843bd36-0528-4494-9b3f-f1975e2b15f2-logs\") pod \"horizon-86559649f7-slxjz\" (UID: \"9843bd36-0528-4494-9b3f-f1975e2b15f2\") " pod="openstack/horizon-86559649f7-slxjz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.131128 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9843bd36-0528-4494-9b3f-f1975e2b15f2-horizon-secret-key\") pod \"horizon-86559649f7-slxjz\" (UID: \"9843bd36-0528-4494-9b3f-f1975e2b15f2\") " pod="openstack/horizon-86559649f7-slxjz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.151562 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-fd86bf475-zh9k8"] Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.171310 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pmds\" (UniqueName: \"kubernetes.io/projected/9843bd36-0528-4494-9b3f-f1975e2b15f2-kube-api-access-5pmds\") pod \"horizon-86559649f7-slxjz\" (UID: \"9843bd36-0528-4494-9b3f-f1975e2b15f2\") " pod="openstack/horizon-86559649f7-slxjz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.198170 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwbtq\" (UniqueName: \"kubernetes.io/projected/d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4-kube-api-access-dwbtq\") pod \"d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4\" (UID: \"d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4\") " Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.204366 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-gk28q"] Oct 01 13:12:23 crc kubenswrapper[4851]: E1001 13:12:23.204898 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4" containerName="mariadb-account-create" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.204919 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4" containerName="mariadb-account-create" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.205149 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4" containerName="mariadb-account-create" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.205881 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-gk28q" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.211723 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4-kube-api-access-dwbtq" (OuterVolumeSpecName: "kube-api-access-dwbtq") pod "d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4" (UID: "d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4"). InnerVolumeSpecName "kube-api-access-dwbtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.219321 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.222446 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qz7dn" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.222493 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.247562 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-gk28q"] Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.257405 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7644495549-mb5sr"] Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.265165 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.280774 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7644495549-mb5sr"] Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.290421 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.294822 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.301103 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.301339 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.302250 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-625qw\" (UniqueName: \"kubernetes.io/projected/a1eb277b-5338-4d54-939f-6d636f34d14c-kube-api-access-625qw\") pod \"placement-db-sync-gk28q\" (UID: \"a1eb277b-5338-4d54-939f-6d636f34d14c\") " pod="openstack/placement-db-sync-gk28q" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.302335 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1eb277b-5338-4d54-939f-6d636f34d14c-scripts\") pod \"placement-db-sync-gk28q\" (UID: \"a1eb277b-5338-4d54-939f-6d636f34d14c\") " pod="openstack/placement-db-sync-gk28q" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.302416 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1eb277b-5338-4d54-939f-6d636f34d14c-config-data\") pod \"placement-db-sync-gk28q\" (UID: \"a1eb277b-5338-4d54-939f-6d636f34d14c\") " pod="openstack/placement-db-sync-gk28q" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.302450 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1eb277b-5338-4d54-939f-6d636f34d14c-combined-ca-bundle\") pod \"placement-db-sync-gk28q\" (UID: \"a1eb277b-5338-4d54-939f-6d636f34d14c\") " pod="openstack/placement-db-sync-gk28q" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.302533 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1eb277b-5338-4d54-939f-6d636f34d14c-logs\") pod \"placement-db-sync-gk28q\" (UID: \"a1eb277b-5338-4d54-939f-6d636f34d14c\") " pod="openstack/placement-db-sync-gk28q" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.302675 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwbtq\" (UniqueName: \"kubernetes.io/projected/d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4-kube-api-access-dwbtq\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.304700 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.324752 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-02e4-account-create-wtbvx" 
event={"ID":"d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4","Type":"ContainerDied","Data":"e3f1391ed5dbdd146e892e45cb7219c3940de87c224aeb6fa3de1d86d90de208"} Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.324800 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3f1391ed5dbdd146e892e45cb7219c3940de87c224aeb6fa3de1d86d90de208" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.325005 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-02e4-account-create-wtbvx" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.352791 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86559649f7-slxjz" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.403671 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-ovsdbserver-sb\") pod \"dnsmasq-dns-7644495549-mb5sr\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.403733 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-scripts\") pod \"ceilometer-0\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.403924 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1eb277b-5338-4d54-939f-6d636f34d14c-config-data\") pod \"placement-db-sync-gk28q\" (UID: \"a1eb277b-5338-4d54-939f-6d636f34d14c\") " pod="openstack/placement-db-sync-gk28q" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.403977 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1eb277b-5338-4d54-939f-6d636f34d14c-combined-ca-bundle\") pod \"placement-db-sync-gk28q\" (UID: \"a1eb277b-5338-4d54-939f-6d636f34d14c\") " pod="openstack/placement-db-sync-gk28q" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.403998 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-log-httpd\") pod \"ceilometer-0\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.404045 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjz5p\" (UniqueName: \"kubernetes.io/projected/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-kube-api-access-hjz5p\") pod \"dnsmasq-dns-7644495549-mb5sr\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.404125 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-run-httpd\") pod \"ceilometer-0\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.404155 4851 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.404178 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1eb277b-5338-4d54-939f-6d636f34d14c-logs\") pod \"placement-db-sync-gk28q\" (UID: \"a1eb277b-5338-4d54-939f-6d636f34d14c\") " pod="openstack/placement-db-sync-gk28q" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.404194 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.404220 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-config\") pod \"dnsmasq-dns-7644495549-mb5sr\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.404236 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-dns-svc\") pod \"dnsmasq-dns-7644495549-mb5sr\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.404254 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-625qw\" (UniqueName: \"kubernetes.io/projected/a1eb277b-5338-4d54-939f-6d636f34d14c-kube-api-access-625qw\") pod \"placement-db-sync-gk28q\" (UID: \"a1eb277b-5338-4d54-939f-6d636f34d14c\") " pod="openstack/placement-db-sync-gk28q" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.404272 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc7p9\" (UniqueName: \"kubernetes.io/projected/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-kube-api-access-zc7p9\") pod \"ceilometer-0\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.404291 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-ovsdbserver-nb\") pod \"dnsmasq-dns-7644495549-mb5sr\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.404310 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-config-data\") pod \"ceilometer-0\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.404339 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-dns-swift-storage-0\") pod \"dnsmasq-dns-7644495549-mb5sr\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.404404 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1eb277b-5338-4d54-939f-6d636f34d14c-scripts\") pod \"placement-db-sync-gk28q\" (UID: \"a1eb277b-5338-4d54-939f-6d636f34d14c\") " pod="openstack/placement-db-sync-gk28q" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.406826 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1eb277b-5338-4d54-939f-6d636f34d14c-logs\") pod \"placement-db-sync-gk28q\" (UID: \"a1eb277b-5338-4d54-939f-6d636f34d14c\") " pod="openstack/placement-db-sync-gk28q" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.416016 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1eb277b-5338-4d54-939f-6d636f34d14c-combined-ca-bundle\") pod \"placement-db-sync-gk28q\" (UID: \"a1eb277b-5338-4d54-939f-6d636f34d14c\") " pod="openstack/placement-db-sync-gk28q" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.416855 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1eb277b-5338-4d54-939f-6d636f34d14c-config-data\") pod \"placement-db-sync-gk28q\" (UID: \"a1eb277b-5338-4d54-939f-6d636f34d14c\") " pod="openstack/placement-db-sync-gk28q" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.418490 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1eb277b-5338-4d54-939f-6d636f34d14c-scripts\") pod \"placement-db-sync-gk28q\" (UID: \"a1eb277b-5338-4d54-939f-6d636f34d14c\") " pod="openstack/placement-db-sync-gk28q" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.432337 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hgsvg" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.435968 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-625qw\" (UniqueName: \"kubernetes.io/projected/a1eb277b-5338-4d54-939f-6d636f34d14c-kube-api-access-625qw\") pod \"placement-db-sync-gk28q\" (UID: \"a1eb277b-5338-4d54-939f-6d636f34d14c\") " pod="openstack/placement-db-sync-gk28q" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.513490 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/794e875d-440e-4a04-acf1-61b1e63c57d7-db-sync-config-data\") pod \"794e875d-440e-4a04-acf1-61b1e63c57d7\" (UID: \"794e875d-440e-4a04-acf1-61b1e63c57d7\") " Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.513602 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794e875d-440e-4a04-acf1-61b1e63c57d7-combined-ca-bundle\") pod \"794e875d-440e-4a04-acf1-61b1e63c57d7\" (UID: \"794e875d-440e-4a04-acf1-61b1e63c57d7\") " Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.513628 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794e875d-440e-4a04-acf1-61b1e63c57d7-config-data\") pod \"794e875d-440e-4a04-acf1-61b1e63c57d7\" (UID: \"794e875d-440e-4a04-acf1-61b1e63c57d7\") " Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.513752 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95cl5\" (UniqueName: \"kubernetes.io/projected/794e875d-440e-4a04-acf1-61b1e63c57d7-kube-api-access-95cl5\") pod \"794e875d-440e-4a04-acf1-61b1e63c57d7\" (UID: \"794e875d-440e-4a04-acf1-61b1e63c57d7\") " Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.514008 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-log-httpd\") pod \"ceilometer-0\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.514045 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjz5p\" (UniqueName: \"kubernetes.io/projected/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-kube-api-access-hjz5p\") pod \"dnsmasq-dns-7644495549-mb5sr\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.514095 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-run-httpd\") pod \"ceilometer-0\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.514130 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.514152 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.514175 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-config\") pod \"dnsmasq-dns-7644495549-mb5sr\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.514193 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-dns-svc\") pod \"dnsmasq-dns-7644495549-mb5sr\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.514213 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc7p9\" (UniqueName: \"kubernetes.io/projected/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-kube-api-access-zc7p9\") pod \"ceilometer-0\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.514232 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-ovsdbserver-nb\") pod \"dnsmasq-dns-7644495549-mb5sr\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.514251 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-config-data\") pod \"ceilometer-0\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.514270 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-dns-swift-storage-0\") pod \"dnsmasq-dns-7644495549-mb5sr\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.514346 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-ovsdbserver-sb\") pod \"dnsmasq-dns-7644495549-mb5sr\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.514390 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-scripts\") pod \"ceilometer-0\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.518032 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-config\") pod \"dnsmasq-dns-7644495549-mb5sr\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:23 crc 
kubenswrapper[4851]: I1001 13:12:23.518629 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-ovsdbserver-nb\") pod \"dnsmasq-dns-7644495549-mb5sr\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.519264 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-dns-svc\") pod \"dnsmasq-dns-7644495549-mb5sr\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.519556 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-ovsdbserver-sb\") pod \"dnsmasq-dns-7644495549-mb5sr\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.521070 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-dns-swift-storage-0\") pod \"dnsmasq-dns-7644495549-mb5sr\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.523018 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/794e875d-440e-4a04-acf1-61b1e63c57d7-kube-api-access-95cl5" (OuterVolumeSpecName: "kube-api-access-95cl5") pod "794e875d-440e-4a04-acf1-61b1e63c57d7" (UID: "794e875d-440e-4a04-acf1-61b1e63c57d7"). InnerVolumeSpecName "kube-api-access-95cl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.528050 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794e875d-440e-4a04-acf1-61b1e63c57d7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "794e875d-440e-4a04-acf1-61b1e63c57d7" (UID: "794e875d-440e-4a04-acf1-61b1e63c57d7"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.525011 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-run-httpd\") pod \"ceilometer-0\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.526759 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.527432 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.524832 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-log-httpd\") pod \"ceilometer-0\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.531534 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-scripts\") pod \"ceilometer-0\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.546161 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc7p9\" (UniqueName: \"kubernetes.io/projected/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-kube-api-access-zc7p9\") pod \"ceilometer-0\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.552732 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-config-data\") pod \"ceilometer-0\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.553242 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjz5p\" (UniqueName: \"kubernetes.io/projected/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-kube-api-access-hjz5p\") pod \"dnsmasq-dns-7644495549-mb5sr\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.579848 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794e875d-440e-4a04-acf1-61b1e63c57d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "794e875d-440e-4a04-acf1-61b1e63c57d7" (UID: "794e875d-440e-4a04-acf1-61b1e63c57d7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.616761 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794e875d-440e-4a04-acf1-61b1e63c57d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.616795 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95cl5\" (UniqueName: \"kubernetes.io/projected/794e875d-440e-4a04-acf1-61b1e63c57d7-kube-api-access-95cl5\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.616806 4851 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/794e875d-440e-4a04-acf1-61b1e63c57d7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.621654 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794e875d-440e-4a04-acf1-61b1e63c57d7-config-data" (OuterVolumeSpecName: "config-data") pod "794e875d-440e-4a04-acf1-61b1e63c57d7" (UID: "794e875d-440e-4a04-acf1-61b1e63c57d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.641362 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-gk28q" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.718615 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794e875d-440e-4a04-acf1-61b1e63c57d7-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.722239 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.729788 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.768461 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fd86bf475-zh9k8"] Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.875776 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86559649f7-slxjz"] Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.910581 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-ddb6c45b5-ncjrz"] Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.928197 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-b6cn9"] Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.943169 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-9zwxh"] Oct 01 13:12:23 crc kubenswrapper[4851]: E1001 13:12:23.943641 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="794e875d-440e-4a04-acf1-61b1e63c57d7" containerName="glance-db-sync" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.943660 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="794e875d-440e-4a04-acf1-61b1e63c57d7" containerName="glance-db-sync" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.943833 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="794e875d-440e-4a04-acf1-61b1e63c57d7" containerName="glance-db-sync" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.944454 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9zwxh" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.946363 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-ckz5d" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.947925 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9zwxh"] Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.947962 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 01 13:12:23 crc kubenswrapper[4851]: I1001 13:12:23.948154 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.022784 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-l9xpp"] Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.023967 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-l9xpp" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.029175 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.029269 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-dpls5" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.030985 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-config-data\") pod \"cinder-db-sync-9zwxh\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " pod="openstack/cinder-db-sync-9zwxh" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.031094 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsbs8\" (UniqueName: \"kubernetes.io/projected/08f27829-7a4b-49d3-aed3-dbae56854228-kube-api-access-hsbs8\") pod \"cinder-db-sync-9zwxh\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " pod="openstack/cinder-db-sync-9zwxh" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.031203 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-db-sync-config-data\") pod \"cinder-db-sync-9zwxh\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " pod="openstack/cinder-db-sync-9zwxh" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.031271 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08f27829-7a4b-49d3-aed3-dbae56854228-etc-machine-id\") pod \"cinder-db-sync-9zwxh\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " pod="openstack/cinder-db-sync-9zwxh" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.031286 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-combined-ca-bundle\") pod \"cinder-db-sync-9zwxh\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " pod="openstack/cinder-db-sync-9zwxh" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.031324 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-scripts\") pod \"cinder-db-sync-9zwxh\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " pod="openstack/cinder-db-sync-9zwxh" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.043888 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-l9xpp"] Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.132483 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d64a19f8-04aa-4c82-9818-6a50d9e3d62b-db-sync-config-data\") pod \"barbican-db-sync-l9xpp\" (UID: \"d64a19f8-04aa-4c82-9818-6a50d9e3d62b\") " pod="openstack/barbican-db-sync-l9xpp" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.134606 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz85j\" (UniqueName: 
\"kubernetes.io/projected/d64a19f8-04aa-4c82-9818-6a50d9e3d62b-kube-api-access-nz85j\") pod \"barbican-db-sync-l9xpp\" (UID: \"d64a19f8-04aa-4c82-9818-6a50d9e3d62b\") " pod="openstack/barbican-db-sync-l9xpp" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.134631 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d64a19f8-04aa-4c82-9818-6a50d9e3d62b-combined-ca-bundle\") pod \"barbican-db-sync-l9xpp\" (UID: \"d64a19f8-04aa-4c82-9818-6a50d9e3d62b\") " pod="openstack/barbican-db-sync-l9xpp" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.134689 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-db-sync-config-data\") pod \"cinder-db-sync-9zwxh\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " pod="openstack/cinder-db-sync-9zwxh" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.134862 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-combined-ca-bundle\") pod \"cinder-db-sync-9zwxh\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " pod="openstack/cinder-db-sync-9zwxh" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.134903 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08f27829-7a4b-49d3-aed3-dbae56854228-etc-machine-id\") pod \"cinder-db-sync-9zwxh\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " pod="openstack/cinder-db-sync-9zwxh" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.134970 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-scripts\") pod \"cinder-db-sync-9zwxh\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " pod="openstack/cinder-db-sync-9zwxh" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.135030 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-config-data\") pod \"cinder-db-sync-9zwxh\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " pod="openstack/cinder-db-sync-9zwxh" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.135159 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsbs8\" (UniqueName: \"kubernetes.io/projected/08f27829-7a4b-49d3-aed3-dbae56854228-kube-api-access-hsbs8\") pod \"cinder-db-sync-9zwxh\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " pod="openstack/cinder-db-sync-9zwxh" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.135454 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08f27829-7a4b-49d3-aed3-dbae56854228-etc-machine-id\") pod \"cinder-db-sync-9zwxh\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " pod="openstack/cinder-db-sync-9zwxh" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.143376 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-db-sync-config-data\") pod \"cinder-db-sync-9zwxh\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " 
pod="openstack/cinder-db-sync-9zwxh" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.146120 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-combined-ca-bundle\") pod \"cinder-db-sync-9zwxh\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " pod="openstack/cinder-db-sync-9zwxh" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.148382 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-scripts\") pod \"cinder-db-sync-9zwxh\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " pod="openstack/cinder-db-sync-9zwxh" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.152367 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-config-data\") pod \"cinder-db-sync-9zwxh\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " pod="openstack/cinder-db-sync-9zwxh" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.172038 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsbs8\" (UniqueName: \"kubernetes.io/projected/08f27829-7a4b-49d3-aed3-dbae56854228-kube-api-access-hsbs8\") pod \"cinder-db-sync-9zwxh\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " pod="openstack/cinder-db-sync-9zwxh" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.237596 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d64a19f8-04aa-4c82-9818-6a50d9e3d62b-db-sync-config-data\") pod \"barbican-db-sync-l9xpp\" (UID: \"d64a19f8-04aa-4c82-9818-6a50d9e3d62b\") " pod="openstack/barbican-db-sync-l9xpp" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.237644 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz85j\" (UniqueName: \"kubernetes.io/projected/d64a19f8-04aa-4c82-9818-6a50d9e3d62b-kube-api-access-nz85j\") pod \"barbican-db-sync-l9xpp\" (UID: \"d64a19f8-04aa-4c82-9818-6a50d9e3d62b\") " pod="openstack/barbican-db-sync-l9xpp" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.237665 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d64a19f8-04aa-4c82-9818-6a50d9e3d62b-combined-ca-bundle\") pod \"barbican-db-sync-l9xpp\" (UID: \"d64a19f8-04aa-4c82-9818-6a50d9e3d62b\") " pod="openstack/barbican-db-sync-l9xpp" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.241810 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d64a19f8-04aa-4c82-9818-6a50d9e3d62b-db-sync-config-data\") pod \"barbican-db-sync-l9xpp\" (UID: \"d64a19f8-04aa-4c82-9818-6a50d9e3d62b\") " pod="openstack/barbican-db-sync-l9xpp" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.255641 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d64a19f8-04aa-4c82-9818-6a50d9e3d62b-combined-ca-bundle\") pod \"barbican-db-sync-l9xpp\" (UID: \"d64a19f8-04aa-4c82-9818-6a50d9e3d62b\") " pod="openstack/barbican-db-sync-l9xpp" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.268384 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nz85j\" (UniqueName: \"kubernetes.io/projected/d64a19f8-04aa-4c82-9818-6a50d9e3d62b-kube-api-access-nz85j\") pod \"barbican-db-sync-l9xpp\" (UID: \"d64a19f8-04aa-4c82-9818-6a50d9e3d62b\") " pod="openstack/barbican-db-sync-l9xpp" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.299126 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9zwxh" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.313819 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-gk28q"] Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.343133 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-l9xpp" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.356513 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7644495549-mb5sr"] Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.372800 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-ddb6c45b5-ncjrz" event={"ID":"2ac095f6-4433-4c0e-9299-dcad17dff9fa","Type":"ContainerStarted","Data":"583508f75a92ef78d3baa331640d1923a76e2a860b71def8ca05dbee31bc2465"} Oct 01 13:12:24 crc kubenswrapper[4851]: W1001 13:12:24.373577 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2dd7fb2_21f4_4208_8944_f0e6c1f423a8.slice/crio-819def5d28c7523d62d8dece47b388b0be99987c0ee52d68bf9b894d9f02938d WatchSource:0}: Error finding container 819def5d28c7523d62d8dece47b388b0be99987c0ee52d68bf9b894d9f02938d: Status 404 returned error can't find the container with id 819def5d28c7523d62d8dece47b388b0be99987c0ee52d68bf9b894d9f02938d Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.376016 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b6cn9" event={"ID":"62c0ded5-8134-4f2e-bd27-0bf08dbe9226","Type":"ContainerStarted","Data":"3ce8c3f4069070fb48b679c33367c42a6c287375a7b2ae6b3ccf0645c2683bef"} Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.376175 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b6cn9" event={"ID":"62c0ded5-8134-4f2e-bd27-0bf08dbe9226","Type":"ContainerStarted","Data":"e4c911699d912cc7e5d30d8a17971fe612afa0d4e0e06587a1622856d34c3a26"} Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.380860 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86559649f7-slxjz" event={"ID":"9843bd36-0528-4494-9b3f-f1975e2b15f2","Type":"ContainerStarted","Data":"eb225048fab6e786ac8ac135ec5ab7489e7ab2194762457f9f1fd7e4d0a59547"} Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.387357 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hgsvg" event={"ID":"794e875d-440e-4a04-acf1-61b1e63c57d7","Type":"ContainerDied","Data":"790fedc8504529909c8fb9bd9937c173c0024321eb94789835a0ad3ff8bb6f04"} Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.387399 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="790fedc8504529909c8fb9bd9937c173c0024321eb94789835a0ad3ff8bb6f04" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.387401 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hgsvg" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.401138 4851 generic.go:334] "Generic (PLEG): container finished" podID="41001489-8c26-4e07-a72d-f4fe8d4fe636" containerID="95ec6d779bf8296eefb71738b9c7169613106a0cd14ab701c53bdaa8d14a6cb8" exitCode=0 Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.401183 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" event={"ID":"41001489-8c26-4e07-a72d-f4fe8d4fe636","Type":"ContainerDied","Data":"95ec6d779bf8296eefb71738b9c7169613106a0cd14ab701c53bdaa8d14a6cb8"} Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.401208 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" event={"ID":"41001489-8c26-4e07-a72d-f4fe8d4fe636","Type":"ContainerStarted","Data":"880e5ecb819be9e82bf9dcd7c8ad63ca1a85973fb8989f0e0eac964f3ad484b7"} Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.409594 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-b6cn9" podStartSLOduration=2.409575892 podStartE2EDuration="2.409575892s" podCreationTimestamp="2025-10-01 13:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:12:24.398324781 +0000 UTC m=+1152.743442267" watchObservedRunningTime="2025-10-01 13:12:24.409575892 +0000 UTC m=+1152.754693368" Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.476177 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.938699 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7644495549-mb5sr"] Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.971718 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d98d6dddf-lzbs9"] Oct 01 13:12:24 crc kubenswrapper[4851]: I1001 13:12:24.973402 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.002094 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d98d6dddf-lzbs9"] Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.020121 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-l9xpp"] Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.029957 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.061872 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9zwxh"] Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.064348 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-config\") pod \"dnsmasq-dns-d98d6dddf-lzbs9\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.064390 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-778pm\" (UniqueName: \"kubernetes.io/projected/6b16af47-e71e-42f4-b45f-72afaaf0e13c-kube-api-access-778pm\") pod \"dnsmasq-dns-d98d6dddf-lzbs9\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.064437 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-dns-svc\") pod \"dnsmasq-dns-d98d6dddf-lzbs9\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.064458 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-dns-swift-storage-0\") pod \"dnsmasq-dns-d98d6dddf-lzbs9\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.064474 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-ovsdbserver-nb\") pod \"dnsmasq-dns-d98d6dddf-lzbs9\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.064519 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-ovsdbserver-sb\") pod \"dnsmasq-dns-d98d6dddf-lzbs9\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.165134 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-dns-swift-storage-0\") pod \"41001489-8c26-4e07-a72d-f4fe8d4fe636\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.165516 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-config\") pod \"41001489-8c26-4e07-a72d-f4fe8d4fe636\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.165559 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-ovsdbserver-nb\") pod \"41001489-8c26-4e07-a72d-f4fe8d4fe636\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.165581 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42m72\" (UniqueName: \"kubernetes.io/projected/41001489-8c26-4e07-a72d-f4fe8d4fe636-kube-api-access-42m72\") pod \"41001489-8c26-4e07-a72d-f4fe8d4fe636\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.165634 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-dns-svc\") pod \"41001489-8c26-4e07-a72d-f4fe8d4fe636\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.165710 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-ovsdbserver-sb\") pod \"41001489-8c26-4e07-a72d-f4fe8d4fe636\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.167899 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-config\") pod \"dnsmasq-dns-d98d6dddf-lzbs9\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.167938 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-778pm\" (UniqueName: \"kubernetes.io/projected/6b16af47-e71e-42f4-b45f-72afaaf0e13c-kube-api-access-778pm\") pod \"dnsmasq-dns-d98d6dddf-lzbs9\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.167988 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-dns-svc\") pod \"dnsmasq-dns-d98d6dddf-lzbs9\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.168011 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-dns-swift-storage-0\") pod \"dnsmasq-dns-d98d6dddf-lzbs9\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.168028 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-ovsdbserver-nb\") pod \"dnsmasq-dns-d98d6dddf-lzbs9\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.168061 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-ovsdbserver-sb\") pod \"dnsmasq-dns-d98d6dddf-lzbs9\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" 
Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.170760 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-ovsdbserver-sb\") pod \"dnsmasq-dns-d98d6dddf-lzbs9\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.173405 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-dns-svc\") pod \"dnsmasq-dns-d98d6dddf-lzbs9\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.173663 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-config\") pod \"dnsmasq-dns-d98d6dddf-lzbs9\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.180131 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-dns-swift-storage-0\") pod \"dnsmasq-dns-d98d6dddf-lzbs9\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.184453 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41001489-8c26-4e07-a72d-f4fe8d4fe636-kube-api-access-42m72" (OuterVolumeSpecName: "kube-api-access-42m72") pod "41001489-8c26-4e07-a72d-f4fe8d4fe636" (UID: "41001489-8c26-4e07-a72d-f4fe8d4fe636"). InnerVolumeSpecName "kube-api-access-42m72". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.185448 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-ovsdbserver-nb\") pod \"dnsmasq-dns-d98d6dddf-lzbs9\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.231428 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-config" (OuterVolumeSpecName: "config") pod "41001489-8c26-4e07-a72d-f4fe8d4fe636" (UID: "41001489-8c26-4e07-a72d-f4fe8d4fe636"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.243666 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "41001489-8c26-4e07-a72d-f4fe8d4fe636" (UID: "41001489-8c26-4e07-a72d-f4fe8d4fe636"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.244319 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "41001489-8c26-4e07-a72d-f4fe8d4fe636" (UID: "41001489-8c26-4e07-a72d-f4fe8d4fe636"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.254031 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-778pm\" (UniqueName: \"kubernetes.io/projected/6b16af47-e71e-42f4-b45f-72afaaf0e13c-kube-api-access-778pm\") pod \"dnsmasq-dns-d98d6dddf-lzbs9\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.271042 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "41001489-8c26-4e07-a72d-f4fe8d4fe636" (UID: "41001489-8c26-4e07-a72d-f4fe8d4fe636"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.271200 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-ovsdbserver-sb\") pod \"41001489-8c26-4e07-a72d-f4fe8d4fe636\" (UID: \"41001489-8c26-4e07-a72d-f4fe8d4fe636\") " Oct 01 13:12:25 crc kubenswrapper[4851]: W1001 13:12:25.271532 4851 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/41001489-8c26-4e07-a72d-f4fe8d4fe636/volumes/kubernetes.io~configmap/ovsdbserver-sb Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.271546 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "41001489-8c26-4e07-a72d-f4fe8d4fe636" (UID: "41001489-8c26-4e07-a72d-f4fe8d4fe636"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.271716 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "41001489-8c26-4e07-a72d-f4fe8d4fe636" (UID: "41001489-8c26-4e07-a72d-f4fe8d4fe636"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.272046 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.272063 4851 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.272076 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.272088 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.272101 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42m72\" (UniqueName: \"kubernetes.io/projected/41001489-8c26-4e07-a72d-f4fe8d4fe636-kube-api-access-42m72\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.272115 4851 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41001489-8c26-4e07-a72d-f4fe8d4fe636-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.331352 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.368198 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86559649f7-slxjz"] Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.430730 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-857f448949-w9w5f"] Oct 01 13:12:25 crc kubenswrapper[4851]: E1001 13:12:25.438319 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41001489-8c26-4e07-a72d-f4fe8d4fe636" containerName="init" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.438358 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="41001489-8c26-4e07-a72d-f4fe8d4fe636" containerName="init" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.479777 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="41001489-8c26-4e07-a72d-f4fe8d4fe636" containerName="init" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.534232 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gk28q" event={"ID":"a1eb277b-5338-4d54-939f-6d636f34d14c","Type":"ContainerStarted","Data":"05712b7c6c410acd5bf063c6c7775e8bc4ed3a273cc93206c564ecf7561994a7"} Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.542051 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55afd90b-7461-4db7-89a6-d45f9bdcb1b3","Type":"ContainerStarted","Data":"c90495043976322ea2e0c36cea4c2c248b599763d47b7721f18e83c9e34f066a"} Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.534477 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-857f448949-w9w5f" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.554145 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-l9xpp" event={"ID":"d64a19f8-04aa-4c82-9818-6a50d9e3d62b","Type":"ContainerStarted","Data":"e4a120bcaef0c513819c5fcfbecbb7aa4b5227783ecfeffa0bf37bcbabd17c49"} Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.558774 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-857f448949-w9w5f"] Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.562789 4851 generic.go:334] "Generic (PLEG): container finished" podID="c2dd7fb2-21f4-4208-8944-f0e6c1f423a8" containerID="2d568d26e666ffcfdb7ce853d057ac1935903f1866739cab250b3169cdc8467d" exitCode=0 Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.562883 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7644495549-mb5sr" event={"ID":"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8","Type":"ContainerDied","Data":"2d568d26e666ffcfdb7ce853d057ac1935903f1866739cab250b3169cdc8467d"} Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.562914 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7644495549-mb5sr" event={"ID":"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8","Type":"ContainerStarted","Data":"819def5d28c7523d62d8dece47b388b0be99987c0ee52d68bf9b894d9f02938d"} Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.566384 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9zwxh" event={"ID":"08f27829-7a4b-49d3-aed3-dbae56854228","Type":"ContainerStarted","Data":"9f6ad98752756c5455b537114f40e502ce4a1922ff7e5a3cce459c9490205b6f"} Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.567471 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.588031 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.588102 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd86bf475-zh9k8" event={"ID":"41001489-8c26-4e07-a72d-f4fe8d4fe636","Type":"ContainerDied","Data":"880e5ecb819be9e82bf9dcd7c8ad63ca1a85973fb8989f0e0eac964f3ad484b7"} Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.588312 4851 scope.go:117] "RemoveContainer" containerID="95ec6d779bf8296eefb71738b9c7169613106a0cd14ab701c53bdaa8d14a6cb8" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.590193 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.591932 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.595549 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-q4nbb" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.595942 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.599335 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.599810 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.608301 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.625264 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-horizon-secret-key\") pod \"horizon-857f448949-w9w5f\" (UID: \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\") " pod="openstack/horizon-857f448949-w9w5f" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.625330 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qldlg\" (UniqueName: \"kubernetes.io/projected/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-kube-api-access-qldlg\") pod \"horizon-857f448949-w9w5f\" (UID: \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\") " pod="openstack/horizon-857f448949-w9w5f" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.625360 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-logs\") pod \"horizon-857f448949-w9w5f\" (UID: \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\") " pod="openstack/horizon-857f448949-w9w5f" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.625383 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-config-data\") pod \"horizon-857f448949-w9w5f\" (UID: \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\") " pod="openstack/horizon-857f448949-w9w5f" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.625463 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-scripts\") pod \"horizon-857f448949-w9w5f\" (UID: \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\") " pod="openstack/horizon-857f448949-w9w5f" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.630851 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.636211 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.656165 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.727114 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.727158 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.727177 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02522c45-cac6-496b-b1dc-83275b924b18-logs\") pod \"glance-default-external-api-0\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.727222 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-horizon-secret-key\") pod \"horizon-857f448949-w9w5f\" (UID: \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\") " pod="openstack/horizon-857f448949-w9w5f" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.727256 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02522c45-cac6-496b-b1dc-83275b924b18-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.727270 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.727311 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qldlg\" (UniqueName: \"kubernetes.io/projected/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-kube-api-access-qldlg\") pod \"horizon-857f448949-w9w5f\" (UID: \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\") " pod="openstack/horizon-857f448949-w9w5f" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.727329 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02522c45-cac6-496b-b1dc-83275b924b18-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"02522c45-cac6-496b-b1dc-83275b924b18\") " pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.727349 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-logs\") pod \"horizon-857f448949-w9w5f\" (UID: \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\") " pod="openstack/horizon-857f448949-w9w5f" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.727372 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-config-data\") pod \"horizon-857f448949-w9w5f\" (UID: \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\") " pod="openstack/horizon-857f448949-w9w5f" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.727392 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.727411 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02522c45-cac6-496b-b1dc-83275b924b18-scripts\") pod \"glance-default-external-api-0\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.727427 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gvvc\" (UniqueName: \"kubernetes.io/projected/02522c45-cac6-496b-b1dc-83275b924b18-kube-api-access-4gvvc\") pod \"glance-default-external-api-0\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.727458 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02522c45-cac6-496b-b1dc-83275b924b18-config-data\") pod \"glance-default-external-api-0\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.727484 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-logs\") pod \"glance-default-internal-api-0\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.727526 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-scripts\") pod \"horizon-857f448949-w9w5f\" (UID: \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\") " pod="openstack/horizon-857f448949-w9w5f" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.727556 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " 
pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.727591 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.727622 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljkwf\" (UniqueName: \"kubernetes.io/projected/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-kube-api-access-ljkwf\") pod \"glance-default-internal-api-0\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.731022 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-logs\") pod \"horizon-857f448949-w9w5f\" (UID: \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\") " pod="openstack/horizon-857f448949-w9w5f" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.733041 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-config-data\") pod \"horizon-857f448949-w9w5f\" (UID: \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\") " pod="openstack/horizon-857f448949-w9w5f" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.733889 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-scripts\") pod \"horizon-857f448949-w9w5f\" (UID: \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\") " pod="openstack/horizon-857f448949-w9w5f" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.738602 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-horizon-secret-key\") pod \"horizon-857f448949-w9w5f\" (UID: \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\") " pod="openstack/horizon-857f448949-w9w5f" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.751557 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fd86bf475-zh9k8"] Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.770000 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qldlg\" (UniqueName: \"kubernetes.io/projected/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-kube-api-access-qldlg\") pod \"horizon-857f448949-w9w5f\" (UID: \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\") " pod="openstack/horizon-857f448949-w9w5f" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.770074 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fd86bf475-zh9k8"] Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.832375 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.832414 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02522c45-cac6-496b-b1dc-83275b924b18-scripts\") pod \"glance-default-external-api-0\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.832434 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gvvc\" (UniqueName: \"kubernetes.io/projected/02522c45-cac6-496b-b1dc-83275b924b18-kube-api-access-4gvvc\") pod \"glance-default-external-api-0\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.832472 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02522c45-cac6-496b-b1dc-83275b924b18-config-data\") pod \"glance-default-external-api-0\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.832519 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-logs\") pod \"glance-default-internal-api-0\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.832553 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.832581 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.832604 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljkwf\" (UniqueName: \"kubernetes.io/projected/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-kube-api-access-ljkwf\") pod \"glance-default-internal-api-0\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.832621 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.832642 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.832657 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02522c45-cac6-496b-b1dc-83275b924b18-logs\") pod 
\"glance-default-external-api-0\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.832694 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02522c45-cac6-496b-b1dc-83275b924b18-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.832710 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.832738 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02522c45-cac6-496b-b1dc-83275b924b18-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.833874 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.834672 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-logs\") pod \"glance-default-internal-api-0\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.836868 4851 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.838683 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02522c45-cac6-496b-b1dc-83275b924b18-logs\") pod \"glance-default-external-api-0\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.839409 4851 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.840308 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02522c45-cac6-496b-b1dc-83275b924b18-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " 
pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.840336 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.845375 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02522c45-cac6-496b-b1dc-83275b924b18-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.851407 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02522c45-cac6-496b-b1dc-83275b924b18-config-data\") pod \"glance-default-external-api-0\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.855996 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.858863 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.859734 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02522c45-cac6-496b-b1dc-83275b924b18-scripts\") pod \"glance-default-external-api-0\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.869145 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljkwf\" (UniqueName: \"kubernetes.io/projected/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-kube-api-access-ljkwf\") pod \"glance-default-internal-api-0\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.869890 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gvvc\" (UniqueName: \"kubernetes.io/projected/02522c45-cac6-496b-b1dc-83275b924b18-kube-api-access-4gvvc\") pod \"glance-default-external-api-0\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.880716 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-857f448949-w9w5f" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.905546 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " pod="openstack/glance-default-external-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.913679 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:12:25 crc kubenswrapper[4851]: I1001 13:12:25.981452 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 13:12:26 crc kubenswrapper[4851]: I1001 13:12:26.062804 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d98d6dddf-lzbs9"] Oct 01 13:12:26 crc kubenswrapper[4851]: I1001 13:12:26.215914 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 13:12:26 crc kubenswrapper[4851]: I1001 13:12:26.401918 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41001489-8c26-4e07-a72d-f4fe8d4fe636" path="/var/lib/kubelet/pods/41001489-8c26-4e07-a72d-f4fe8d4fe636/volumes" Oct 01 13:12:26 crc kubenswrapper[4851]: I1001 13:12:26.598956 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7644495549-mb5sr" event={"ID":"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8","Type":"ContainerStarted","Data":"15dc85017d3d1131995c575b70492d9e4b367ffab1b4d03f0e9ccef87684c838"} Oct 01 13:12:26 crc kubenswrapper[4851]: I1001 13:12:26.599359 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7644495549-mb5sr" podUID="c2dd7fb2-21f4-4208-8944-f0e6c1f423a8" containerName="dnsmasq-dns" containerID="cri-o://15dc85017d3d1131995c575b70492d9e4b367ffab1b4d03f0e9ccef87684c838" gracePeriod=10 Oct 01 13:12:26 crc kubenswrapper[4851]: I1001 13:12:26.599669 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:26 crc kubenswrapper[4851]: I1001 13:12:26.629840 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7644495549-mb5sr" podStartSLOduration=3.62982038 podStartE2EDuration="3.62982038s" podCreationTimestamp="2025-10-01 13:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:12:26.629385378 +0000 UTC m=+1154.974502874" watchObservedRunningTime="2025-10-01 13:12:26.62982038 +0000 UTC m=+1154.974937866" Oct 01 13:12:26 crc kubenswrapper[4851]: I1001 13:12:26.647468 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" event={"ID":"6b16af47-e71e-42f4-b45f-72afaaf0e13c","Type":"ContainerStarted","Data":"715f94edf35f0a3770f7ffbb228c4f4679cc50d9edeab538080d032e5400826c"} Oct 01 13:12:26 crc kubenswrapper[4851]: I1001 13:12:26.743322 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-857f448949-w9w5f"] Oct 01 13:12:26 crc kubenswrapper[4851]: I1001 13:12:26.770007 4851 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:12:26 crc kubenswrapper[4851]: W1001 13:12:26.772618 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d81c0d9_cf73_42b3_bc5a_69ca2699b8a7.slice/crio-46d61a47b6bcef4c3a31ac3ae4e8e2d8397605420f1fe3ad4f8326b01c5a2190 WatchSource:0}: Error finding container 46d61a47b6bcef4c3a31ac3ae4e8e2d8397605420f1fe3ad4f8326b01c5a2190: Status 404 returned error can't find the container with id 46d61a47b6bcef4c3a31ac3ae4e8e2d8397605420f1fe3ad4f8326b01c5a2190 Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.212577 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.246370 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.366527 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-ovsdbserver-sb\") pod \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.366610 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-dns-svc\") pod \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.366692 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-ovsdbserver-nb\") pod \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.366719 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjz5p\" (UniqueName: \"kubernetes.io/projected/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-kube-api-access-hjz5p\") pod \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.366781 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-dns-swift-storage-0\") pod \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.366803 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-config\") pod \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\" (UID: \"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8\") " Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.385947 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-kube-api-access-hjz5p" (OuterVolumeSpecName: "kube-api-access-hjz5p") pod "c2dd7fb2-21f4-4208-8944-f0e6c1f423a8" (UID: "c2dd7fb2-21f4-4208-8944-f0e6c1f423a8"). InnerVolumeSpecName "kube-api-access-hjz5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.444417 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c2dd7fb2-21f4-4208-8944-f0e6c1f423a8" (UID: "c2dd7fb2-21f4-4208-8944-f0e6c1f423a8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.471380 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjz5p\" (UniqueName: \"kubernetes.io/projected/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-kube-api-access-hjz5p\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.471409 4851 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.478013 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c2dd7fb2-21f4-4208-8944-f0e6c1f423a8" (UID: "c2dd7fb2-21f4-4208-8944-f0e6c1f423a8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.478594 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-config" (OuterVolumeSpecName: "config") pod "c2dd7fb2-21f4-4208-8944-f0e6c1f423a8" (UID: "c2dd7fb2-21f4-4208-8944-f0e6c1f423a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.479455 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c2dd7fb2-21f4-4208-8944-f0e6c1f423a8" (UID: "c2dd7fb2-21f4-4208-8944-f0e6c1f423a8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.483987 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c2dd7fb2-21f4-4208-8944-f0e6c1f423a8" (UID: "c2dd7fb2-21f4-4208-8944-f0e6c1f423a8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.575637 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.575670 4851 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.575680 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.575689 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.664929 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"02522c45-cac6-496b-b1dc-83275b924b18","Type":"ContainerStarted","Data":"30a02d3ef04d8380aa92ea2740377a5edc0ae204e3ffa0aeaddf8e98ca86a664"} Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.670011 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" event={"ID":"6b16af47-e71e-42f4-b45f-72afaaf0e13c","Type":"ContainerDied","Data":"6cda9ad7e176ad79970fdfa33bc29213f0e4717a74ee8af168ee70bec411f0d1"} Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.669874 4851 generic.go:334] "Generic (PLEG): container finished" podID="6b16af47-e71e-42f4-b45f-72afaaf0e13c" containerID="6cda9ad7e176ad79970fdfa33bc29213f0e4717a74ee8af168ee70bec411f0d1" exitCode=0 Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.671987 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-857f448949-w9w5f" event={"ID":"5d3e8ca0-bbef-4757-962b-68ea6f20af9a","Type":"ContainerStarted","Data":"79a580227f4b324e31e7113073a52f17b5fbae984c47d5ae9fa6244a6f53c1fa"} Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.675074 4851 generic.go:334] "Generic (PLEG): container finished" podID="c2dd7fb2-21f4-4208-8944-f0e6c1f423a8" containerID="15dc85017d3d1131995c575b70492d9e4b367ffab1b4d03f0e9ccef87684c838" exitCode=0 Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.675117 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7644495549-mb5sr" event={"ID":"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8","Type":"ContainerDied","Data":"15dc85017d3d1131995c575b70492d9e4b367ffab1b4d03f0e9ccef87684c838"} Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.675136 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7644495549-mb5sr" event={"ID":"c2dd7fb2-21f4-4208-8944-f0e6c1f423a8","Type":"ContainerDied","Data":"819def5d28c7523d62d8dece47b388b0be99987c0ee52d68bf9b894d9f02938d"} Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.675151 4851 scope.go:117] "RemoveContainer" containerID="15dc85017d3d1131995c575b70492d9e4b367ffab1b4d03f0e9ccef87684c838" Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.675238 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7644495549-mb5sr" Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.709899 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7","Type":"ContainerStarted","Data":"46d61a47b6bcef4c3a31ac3ae4e8e2d8397605420f1fe3ad4f8326b01c5a2190"} Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.838661 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7644495549-mb5sr"] Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.843906 4851 scope.go:117] "RemoveContainer" containerID="2d568d26e666ffcfdb7ce853d057ac1935903f1866739cab250b3169cdc8467d" Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.847716 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7644495549-mb5sr"] Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.916737 4851 scope.go:117] "RemoveContainer" containerID="15dc85017d3d1131995c575b70492d9e4b367ffab1b4d03f0e9ccef87684c838" Oct 01 13:12:27 crc kubenswrapper[4851]: E1001 13:12:27.918918 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15dc85017d3d1131995c575b70492d9e4b367ffab1b4d03f0e9ccef87684c838\": container with ID starting with 15dc85017d3d1131995c575b70492d9e4b367ffab1b4d03f0e9ccef87684c838 not found: ID does not exist" containerID="15dc85017d3d1131995c575b70492d9e4b367ffab1b4d03f0e9ccef87684c838" Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.919294 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15dc85017d3d1131995c575b70492d9e4b367ffab1b4d03f0e9ccef87684c838"} err="failed to get container status \"15dc85017d3d1131995c575b70492d9e4b367ffab1b4d03f0e9ccef87684c838\": rpc error: code = NotFound desc = could not find container \"15dc85017d3d1131995c575b70492d9e4b367ffab1b4d03f0e9ccef87684c838\": container with ID starting with 15dc85017d3d1131995c575b70492d9e4b367ffab1b4d03f0e9ccef87684c838 not found: ID does not exist" Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.919354 4851 scope.go:117] "RemoveContainer" containerID="2d568d26e666ffcfdb7ce853d057ac1935903f1866739cab250b3169cdc8467d" Oct 01 13:12:27 crc kubenswrapper[4851]: E1001 13:12:27.919986 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d568d26e666ffcfdb7ce853d057ac1935903f1866739cab250b3169cdc8467d\": container with ID starting with 2d568d26e666ffcfdb7ce853d057ac1935903f1866739cab250b3169cdc8467d not found: ID does not exist" containerID="2d568d26e666ffcfdb7ce853d057ac1935903f1866739cab250b3169cdc8467d" Oct 01 13:12:27 crc kubenswrapper[4851]: I1001 13:12:27.920024 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d568d26e666ffcfdb7ce853d057ac1935903f1866739cab250b3169cdc8467d"} err="failed to get container status \"2d568d26e666ffcfdb7ce853d057ac1935903f1866739cab250b3169cdc8467d\": rpc error: code = NotFound desc = could not find container \"2d568d26e666ffcfdb7ce853d057ac1935903f1866739cab250b3169cdc8467d\": container with ID starting with 2d568d26e666ffcfdb7ce853d057ac1935903f1866739cab250b3169cdc8467d not found: ID does not exist" Oct 01 13:12:28 crc kubenswrapper[4851]: I1001 13:12:28.339047 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2dd7fb2-21f4-4208-8944-f0e6c1f423a8" 
path="/var/lib/kubelet/pods/c2dd7fb2-21f4-4208-8944-f0e6c1f423a8/volumes" Oct 01 13:12:28 crc kubenswrapper[4851]: I1001 13:12:28.728969 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-f4s7j" event={"ID":"384e4797-6339-4742-9430-e87739c80e74","Type":"ContainerStarted","Data":"cb3639292db6b2c3698abda9b74a5ce1a47634af99a24fa9daf627631532a5d7"} Oct 01 13:12:28 crc kubenswrapper[4851]: I1001 13:12:28.749797 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-f4s7j" podStartSLOduration=10.042589262 podStartE2EDuration="40.749775054s" podCreationTimestamp="2025-10-01 13:11:48 +0000 UTC" firstStartedPulling="2025-10-01 13:11:56.792449396 +0000 UTC m=+1125.137566882" lastFinishedPulling="2025-10-01 13:12:27.499635188 +0000 UTC m=+1155.844752674" observedRunningTime="2025-10-01 13:12:28.746685416 +0000 UTC m=+1157.091802902" watchObservedRunningTime="2025-10-01 13:12:28.749775054 +0000 UTC m=+1157.094892550" Oct 01 13:12:28 crc kubenswrapper[4851]: I1001 13:12:28.753248 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7","Type":"ContainerStarted","Data":"bd68d31381d20fb45c7d0cf170937377640a94cb2ee1ec77d59b0165c23c3a88"} Oct 01 13:12:28 crc kubenswrapper[4851]: I1001 13:12:28.757106 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"02522c45-cac6-496b-b1dc-83275b924b18","Type":"ContainerStarted","Data":"884f973bd7f32cb81005f66435625de6d609be575f825ef0e23fc17961a52f77"} Oct 01 13:12:28 crc kubenswrapper[4851]: I1001 13:12:28.766934 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" event={"ID":"6b16af47-e71e-42f4-b45f-72afaaf0e13c","Type":"ContainerStarted","Data":"3c334d0cc4506bef3d5cfc40c9419db0e3ef1d6b6268f6c032bc4f9a002a6599"} Oct 01 13:12:28 crc kubenswrapper[4851]: I1001 13:12:28.767308 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:12:28 crc kubenswrapper[4851]: I1001 13:12:28.792830 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" podStartSLOduration=4.7928142529999995 podStartE2EDuration="4.792814253s" podCreationTimestamp="2025-10-01 13:12:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:12:28.790753004 +0000 UTC m=+1157.135870510" watchObservedRunningTime="2025-10-01 13:12:28.792814253 +0000 UTC m=+1157.137931739" Oct 01 13:12:29 crc kubenswrapper[4851]: I1001 13:12:29.234711 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-fvbs7"] Oct 01 13:12:29 crc kubenswrapper[4851]: E1001 13:12:29.235367 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2dd7fb2-21f4-4208-8944-f0e6c1f423a8" containerName="init" Oct 01 13:12:29 crc kubenswrapper[4851]: I1001 13:12:29.235380 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2dd7fb2-21f4-4208-8944-f0e6c1f423a8" containerName="init" Oct 01 13:12:29 crc kubenswrapper[4851]: E1001 13:12:29.235395 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2dd7fb2-21f4-4208-8944-f0e6c1f423a8" containerName="dnsmasq-dns" Oct 01 13:12:29 crc kubenswrapper[4851]: I1001 13:12:29.235401 4851 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c2dd7fb2-21f4-4208-8944-f0e6c1f423a8" containerName="dnsmasq-dns" Oct 01 13:12:29 crc kubenswrapper[4851]: I1001 13:12:29.235611 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2dd7fb2-21f4-4208-8944-f0e6c1f423a8" containerName="dnsmasq-dns" Oct 01 13:12:29 crc kubenswrapper[4851]: I1001 13:12:29.236608 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fvbs7" Oct 01 13:12:29 crc kubenswrapper[4851]: I1001 13:12:29.240932 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 01 13:12:29 crc kubenswrapper[4851]: I1001 13:12:29.241043 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 01 13:12:29 crc kubenswrapper[4851]: I1001 13:12:29.240935 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-sq998" Oct 01 13:12:29 crc kubenswrapper[4851]: I1001 13:12:29.265316 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fvbs7"] Oct 01 13:12:29 crc kubenswrapper[4851]: I1001 13:12:29.335452 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc29451-e27b-4bd0-9a5f-6a177e7621be-combined-ca-bundle\") pod \"neutron-db-sync-fvbs7\" (UID: \"bfc29451-e27b-4bd0-9a5f-6a177e7621be\") " pod="openstack/neutron-db-sync-fvbs7" Oct 01 13:12:29 crc kubenswrapper[4851]: I1001 13:12:29.335509 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bfc29451-e27b-4bd0-9a5f-6a177e7621be-config\") pod \"neutron-db-sync-fvbs7\" (UID: \"bfc29451-e27b-4bd0-9a5f-6a177e7621be\") " pod="openstack/neutron-db-sync-fvbs7" Oct 01 13:12:29 crc kubenswrapper[4851]: I1001 13:12:29.335644 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll4rb\" (UniqueName: \"kubernetes.io/projected/bfc29451-e27b-4bd0-9a5f-6a177e7621be-kube-api-access-ll4rb\") pod \"neutron-db-sync-fvbs7\" (UID: \"bfc29451-e27b-4bd0-9a5f-6a177e7621be\") " pod="openstack/neutron-db-sync-fvbs7" Oct 01 13:12:29 crc kubenswrapper[4851]: I1001 13:12:29.437319 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll4rb\" (UniqueName: \"kubernetes.io/projected/bfc29451-e27b-4bd0-9a5f-6a177e7621be-kube-api-access-ll4rb\") pod \"neutron-db-sync-fvbs7\" (UID: \"bfc29451-e27b-4bd0-9a5f-6a177e7621be\") " pod="openstack/neutron-db-sync-fvbs7" Oct 01 13:12:29 crc kubenswrapper[4851]: I1001 13:12:29.437378 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc29451-e27b-4bd0-9a5f-6a177e7621be-combined-ca-bundle\") pod \"neutron-db-sync-fvbs7\" (UID: \"bfc29451-e27b-4bd0-9a5f-6a177e7621be\") " pod="openstack/neutron-db-sync-fvbs7" Oct 01 13:12:29 crc kubenswrapper[4851]: I1001 13:12:29.437399 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bfc29451-e27b-4bd0-9a5f-6a177e7621be-config\") pod \"neutron-db-sync-fvbs7\" (UID: \"bfc29451-e27b-4bd0-9a5f-6a177e7621be\") " pod="openstack/neutron-db-sync-fvbs7" Oct 01 13:12:29 crc kubenswrapper[4851]: I1001 13:12:29.443748 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc29451-e27b-4bd0-9a5f-6a177e7621be-combined-ca-bundle\") pod \"neutron-db-sync-fvbs7\" (UID: \"bfc29451-e27b-4bd0-9a5f-6a177e7621be\") " pod="openstack/neutron-db-sync-fvbs7" Oct 01 13:12:29 crc kubenswrapper[4851]: I1001 13:12:29.444799 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bfc29451-e27b-4bd0-9a5f-6a177e7621be-config\") pod \"neutron-db-sync-fvbs7\" (UID: \"bfc29451-e27b-4bd0-9a5f-6a177e7621be\") " pod="openstack/neutron-db-sync-fvbs7" Oct 01 13:12:29 crc kubenswrapper[4851]: I1001 13:12:29.453450 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll4rb\" (UniqueName: \"kubernetes.io/projected/bfc29451-e27b-4bd0-9a5f-6a177e7621be-kube-api-access-ll4rb\") pod \"neutron-db-sync-fvbs7\" (UID: \"bfc29451-e27b-4bd0-9a5f-6a177e7621be\") " pod="openstack/neutron-db-sync-fvbs7" Oct 01 13:12:29 crc kubenswrapper[4851]: I1001 13:12:29.565036 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fvbs7" Oct 01 13:12:29 crc kubenswrapper[4851]: I1001 13:12:29.779589 4851 generic.go:334] "Generic (PLEG): container finished" podID="62c0ded5-8134-4f2e-bd27-0bf08dbe9226" containerID="3ce8c3f4069070fb48b679c33367c42a6c287375a7b2ae6b3ccf0645c2683bef" exitCode=0 Oct 01 13:12:29 crc kubenswrapper[4851]: I1001 13:12:29.779663 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b6cn9" event={"ID":"62c0ded5-8134-4f2e-bd27-0bf08dbe9226","Type":"ContainerDied","Data":"3ce8c3f4069070fb48b679c33367c42a6c287375a7b2ae6b3ccf0645c2683bef"} Oct 01 13:12:30 crc kubenswrapper[4851]: I1001 13:12:30.049994 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:12:30 crc kubenswrapper[4851]: I1001 13:12:30.050059 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:12:31 crc kubenswrapper[4851]: I1001 13:12:31.816381 4851 generic.go:334] "Generic (PLEG): container finished" podID="384e4797-6339-4742-9430-e87739c80e74" containerID="cb3639292db6b2c3698abda9b74a5ce1a47634af99a24fa9daf627631532a5d7" exitCode=0 Oct 01 13:12:31 crc kubenswrapper[4851]: I1001 13:12:31.816455 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-f4s7j" event={"ID":"384e4797-6339-4742-9430-e87739c80e74","Type":"ContainerDied","Data":"cb3639292db6b2c3698abda9b74a5ce1a47634af99a24fa9daf627631532a5d7"} Oct 01 13:12:34 crc kubenswrapper[4851]: I1001 13:12:34.578334 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:12:34 crc kubenswrapper[4851]: I1001 13:12:34.645602 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.333445 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:12:35 crc 
kubenswrapper[4851]: I1001 13:12:35.400839 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b6747f4f9-jm2gc"] Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.401099 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc" podUID="cd0e2b8e-610a-4a44-b7a7-41e7af05fde1" containerName="dnsmasq-dns" containerID="cri-o://0a5516bbe7b63d7f5a6de326a8acefedb9dac9702050318a7689407ae1994a0a" gracePeriod=10 Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.483208 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-ddb6c45b5-ncjrz"] Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.508024 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-67b666754b-b52ns"] Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.509507 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.512013 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.519415 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67b666754b-b52ns"] Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.597588 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-857f448949-w9w5f"] Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.621127 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-58c9859d68-bckn5"] Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.622679 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.656944 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58c9859d68-bckn5"] Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.685254 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1db63449-71cb-4fa2-86db-43a83a914643-config-data\") pod \"horizon-67b666754b-b52ns\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.685479 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1db63449-71cb-4fa2-86db-43a83a914643-combined-ca-bundle\") pod \"horizon-67b666754b-b52ns\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.685584 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8v56\" (UniqueName: \"kubernetes.io/projected/1db63449-71cb-4fa2-86db-43a83a914643-kube-api-access-l8v56\") pod \"horizon-67b666754b-b52ns\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.685615 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1db63449-71cb-4fa2-86db-43a83a914643-horizon-tls-certs\") pod \"horizon-67b666754b-b52ns\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " 
pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.685737 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1db63449-71cb-4fa2-86db-43a83a914643-scripts\") pod \"horizon-67b666754b-b52ns\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.685856 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1db63449-71cb-4fa2-86db-43a83a914643-horizon-secret-key\") pod \"horizon-67b666754b-b52ns\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.685952 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1db63449-71cb-4fa2-86db-43a83a914643-logs\") pod \"horizon-67b666754b-b52ns\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.787389 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1db63449-71cb-4fa2-86db-43a83a914643-horizon-secret-key\") pod \"horizon-67b666754b-b52ns\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.787428 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fcf93f8-06db-4cab-8699-9051ca2ae50a-logs\") pod \"horizon-58c9859d68-bckn5\" (UID: \"2fcf93f8-06db-4cab-8699-9051ca2ae50a\") " pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.787449 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fcf93f8-06db-4cab-8699-9051ca2ae50a-horizon-tls-certs\") pod \"horizon-58c9859d68-bckn5\" (UID: \"2fcf93f8-06db-4cab-8699-9051ca2ae50a\") " pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.787511 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1db63449-71cb-4fa2-86db-43a83a914643-logs\") pod \"horizon-67b666754b-b52ns\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.787528 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fcf93f8-06db-4cab-8699-9051ca2ae50a-combined-ca-bundle\") pod \"horizon-58c9859d68-bckn5\" (UID: \"2fcf93f8-06db-4cab-8699-9051ca2ae50a\") " pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.787580 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1db63449-71cb-4fa2-86db-43a83a914643-config-data\") pod \"horizon-67b666754b-b52ns\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc 
kubenswrapper[4851]: I1001 13:12:35.787613 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1db63449-71cb-4fa2-86db-43a83a914643-combined-ca-bundle\") pod \"horizon-67b666754b-b52ns\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.787642 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8v56\" (UniqueName: \"kubernetes.io/projected/1db63449-71cb-4fa2-86db-43a83a914643-kube-api-access-l8v56\") pod \"horizon-67b666754b-b52ns\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.787660 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2fcf93f8-06db-4cab-8699-9051ca2ae50a-scripts\") pod \"horizon-58c9859d68-bckn5\" (UID: \"2fcf93f8-06db-4cab-8699-9051ca2ae50a\") " pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.787677 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1db63449-71cb-4fa2-86db-43a83a914643-horizon-tls-certs\") pod \"horizon-67b666754b-b52ns\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.787692 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2fcf93f8-06db-4cab-8699-9051ca2ae50a-config-data\") pod \"horizon-58c9859d68-bckn5\" (UID: \"2fcf93f8-06db-4cab-8699-9051ca2ae50a\") " pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.787714 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2fcf93f8-06db-4cab-8699-9051ca2ae50a-horizon-secret-key\") pod \"horizon-58c9859d68-bckn5\" (UID: \"2fcf93f8-06db-4cab-8699-9051ca2ae50a\") " pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.787730 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6tt9\" (UniqueName: \"kubernetes.io/projected/2fcf93f8-06db-4cab-8699-9051ca2ae50a-kube-api-access-c6tt9\") pod \"horizon-58c9859d68-bckn5\" (UID: \"2fcf93f8-06db-4cab-8699-9051ca2ae50a\") " pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.787770 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1db63449-71cb-4fa2-86db-43a83a914643-scripts\") pod \"horizon-67b666754b-b52ns\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.788384 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1db63449-71cb-4fa2-86db-43a83a914643-scripts\") pod \"horizon-67b666754b-b52ns\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.789462 4851 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1db63449-71cb-4fa2-86db-43a83a914643-logs\") pod \"horizon-67b666754b-b52ns\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.790076 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1db63449-71cb-4fa2-86db-43a83a914643-config-data\") pod \"horizon-67b666754b-b52ns\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.795762 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1db63449-71cb-4fa2-86db-43a83a914643-combined-ca-bundle\") pod \"horizon-67b666754b-b52ns\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.796834 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1db63449-71cb-4fa2-86db-43a83a914643-horizon-tls-certs\") pod \"horizon-67b666754b-b52ns\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.814327 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1db63449-71cb-4fa2-86db-43a83a914643-horizon-secret-key\") pod \"horizon-67b666754b-b52ns\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.819085 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8v56\" (UniqueName: \"kubernetes.io/projected/1db63449-71cb-4fa2-86db-43a83a914643-kube-api-access-l8v56\") pod \"horizon-67b666754b-b52ns\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.864719 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.889920 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fcf93f8-06db-4cab-8699-9051ca2ae50a-logs\") pod \"horizon-58c9859d68-bckn5\" (UID: \"2fcf93f8-06db-4cab-8699-9051ca2ae50a\") " pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.889960 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fcf93f8-06db-4cab-8699-9051ca2ae50a-horizon-tls-certs\") pod \"horizon-58c9859d68-bckn5\" (UID: \"2fcf93f8-06db-4cab-8699-9051ca2ae50a\") " pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.890008 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fcf93f8-06db-4cab-8699-9051ca2ae50a-combined-ca-bundle\") pod \"horizon-58c9859d68-bckn5\" (UID: \"2fcf93f8-06db-4cab-8699-9051ca2ae50a\") " pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.890083 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2fcf93f8-06db-4cab-8699-9051ca2ae50a-scripts\") pod \"horizon-58c9859d68-bckn5\" (UID: \"2fcf93f8-06db-4cab-8699-9051ca2ae50a\") " pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.890109 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2fcf93f8-06db-4cab-8699-9051ca2ae50a-config-data\") pod \"horizon-58c9859d68-bckn5\" (UID: \"2fcf93f8-06db-4cab-8699-9051ca2ae50a\") " pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.890133 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2fcf93f8-06db-4cab-8699-9051ca2ae50a-horizon-secret-key\") pod \"horizon-58c9859d68-bckn5\" (UID: \"2fcf93f8-06db-4cab-8699-9051ca2ae50a\") " pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.890147 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6tt9\" (UniqueName: \"kubernetes.io/projected/2fcf93f8-06db-4cab-8699-9051ca2ae50a-kube-api-access-c6tt9\") pod \"horizon-58c9859d68-bckn5\" (UID: \"2fcf93f8-06db-4cab-8699-9051ca2ae50a\") " pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.890982 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fcf93f8-06db-4cab-8699-9051ca2ae50a-logs\") pod \"horizon-58c9859d68-bckn5\" (UID: \"2fcf93f8-06db-4cab-8699-9051ca2ae50a\") " pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.891220 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2fcf93f8-06db-4cab-8699-9051ca2ae50a-scripts\") pod \"horizon-58c9859d68-bckn5\" (UID: \"2fcf93f8-06db-4cab-8699-9051ca2ae50a\") " pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.892036 4851 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2fcf93f8-06db-4cab-8699-9051ca2ae50a-config-data\") pod \"horizon-58c9859d68-bckn5\" (UID: \"2fcf93f8-06db-4cab-8699-9051ca2ae50a\") " pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.898236 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fcf93f8-06db-4cab-8699-9051ca2ae50a-combined-ca-bundle\") pod \"horizon-58c9859d68-bckn5\" (UID: \"2fcf93f8-06db-4cab-8699-9051ca2ae50a\") " pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.900046 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2fcf93f8-06db-4cab-8699-9051ca2ae50a-horizon-secret-key\") pod \"horizon-58c9859d68-bckn5\" (UID: \"2fcf93f8-06db-4cab-8699-9051ca2ae50a\") " pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.900814 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fcf93f8-06db-4cab-8699-9051ca2ae50a-horizon-tls-certs\") pod \"horizon-58c9859d68-bckn5\" (UID: \"2fcf93f8-06db-4cab-8699-9051ca2ae50a\") " pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.946977 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6tt9\" (UniqueName: \"kubernetes.io/projected/2fcf93f8-06db-4cab-8699-9051ca2ae50a-kube-api-access-c6tt9\") pod \"horizon-58c9859d68-bckn5\" (UID: \"2fcf93f8-06db-4cab-8699-9051ca2ae50a\") " pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:35 crc kubenswrapper[4851]: I1001 13:12:35.953053 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:12:36 crc kubenswrapper[4851]: I1001 13:12:36.864914 4851 generic.go:334] "Generic (PLEG): container finished" podID="cd0e2b8e-610a-4a44-b7a7-41e7af05fde1" containerID="0a5516bbe7b63d7f5a6de326a8acefedb9dac9702050318a7689407ae1994a0a" exitCode=0 Oct 01 13:12:36 crc kubenswrapper[4851]: I1001 13:12:36.864974 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc" event={"ID":"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1","Type":"ContainerDied","Data":"0a5516bbe7b63d7f5a6de326a8acefedb9dac9702050318a7689407ae1994a0a"} Oct 01 13:12:38 crc kubenswrapper[4851]: I1001 13:12:38.463847 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc" podUID="cd0e2b8e-610a-4a44-b7a7-41e7af05fde1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: connect: connection refused" Oct 01 13:12:39 crc kubenswrapper[4851]: I1001 13:12:39.901570 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7","Type":"ContainerStarted","Data":"e7f0766ad644552ba0536fb5d35b86be8b435b6b09360208aea4186770f40dae"} Oct 01 13:12:41 crc kubenswrapper[4851]: I1001 13:12:41.923266 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7" containerName="glance-httpd" containerID="cri-o://e7f0766ad644552ba0536fb5d35b86be8b435b6b09360208aea4186770f40dae" gracePeriod=30 Oct 01 13:12:41 crc kubenswrapper[4851]: I1001 13:12:41.923809 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7" containerName="glance-log" containerID="cri-o://bd68d31381d20fb45c7d0cf170937377640a94cb2ee1ec77d59b0165c23c3a88" gracePeriod=30 Oct 01 13:12:41 crc kubenswrapper[4851]: I1001 13:12:41.960137 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=16.960072345 podStartE2EDuration="16.960072345s" podCreationTimestamp="2025-10-01 13:12:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:12:41.952687394 +0000 UTC m=+1170.297804900" watchObservedRunningTime="2025-10-01 13:12:41.960072345 +0000 UTC m=+1170.305189871" Oct 01 13:12:42 crc kubenswrapper[4851]: I1001 13:12:42.951054 4851 generic.go:334] "Generic (PLEG): container finished" podID="0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7" containerID="bd68d31381d20fb45c7d0cf170937377640a94cb2ee1ec77d59b0165c23c3a88" exitCode=143 Oct 01 13:12:42 crc kubenswrapper[4851]: I1001 13:12:42.951333 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7","Type":"ContainerDied","Data":"bd68d31381d20fb45c7d0cf170937377640a94cb2ee1ec77d59b0165c23c3a88"} Oct 01 13:12:43 crc kubenswrapper[4851]: I1001 13:12:43.464787 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc" podUID="cd0e2b8e-610a-4a44-b7a7-41e7af05fde1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: connect: connection refused" Oct 01 13:12:43 crc kubenswrapper[4851]: I1001 13:12:43.967052 4851 generic.go:334] 
"Generic (PLEG): container finished" podID="0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7" containerID="e7f0766ad644552ba0536fb5d35b86be8b435b6b09360208aea4186770f40dae" exitCode=0 Oct 01 13:12:43 crc kubenswrapper[4851]: I1001 13:12:43.967127 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7","Type":"ContainerDied","Data":"e7f0766ad644552ba0536fb5d35b86be8b435b6b09360208aea4186770f40dae"} Oct 01 13:12:43 crc kubenswrapper[4851]: E1001 13:12:43.993282 4851 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/podified-master-centos10/openstack-placement-api:watcher_latest" Oct 01 13:12:43 crc kubenswrapper[4851]: E1001 13:12:43.993395 4851 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/podified-master-centos10/openstack-placement-api:watcher_latest" Oct 01 13:12:43 crc kubenswrapper[4851]: E1001 13:12:43.993685 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:38.102.83.36:5001/podified-master-centos10/openstack-placement-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-625qw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-gk28q_openstack(a1eb277b-5338-4d54-939f-6d636f34d14c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 13:12:43 crc kubenswrapper[4851]: E1001 13:12:43.995036 4851 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-gk28q" podUID="a1eb277b-5338-4d54-939f-6d636f34d14c" Oct 01 13:12:44 crc kubenswrapper[4851]: E1001 13:12:44.978825 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/podified-master-centos10/openstack-placement-api:watcher_latest\\\"\"" pod="openstack/placement-db-sync-gk28q" podUID="a1eb277b-5338-4d54-939f-6d636f34d14c" Oct 01 13:12:48 crc kubenswrapper[4851]: I1001 13:12:48.464106 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc" podUID="cd0e2b8e-610a-4a44-b7a7-41e7af05fde1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: connect: connection refused" Oct 01 13:12:48 crc kubenswrapper[4851]: I1001 13:12:48.464916 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.463434 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc" podUID="cd0e2b8e-610a-4a44-b7a7-41e7af05fde1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: connect: connection refused" Oct 01 13:12:53 crc kubenswrapper[4851]: E1001 13:12:53.484555 4851 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/podified-master-centos10/openstack-horizon:watcher_latest" Oct 01 13:12:53 crc kubenswrapper[4851]: E1001 13:12:53.484650 4851 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/podified-master-centos10/openstack-horizon:watcher_latest" Oct 01 13:12:53 crc kubenswrapper[4851]: E1001 13:12:53.484807 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.36:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n58ch5c8h5d5h584h697h76hc6h599h57h646h667h579h657h599h5bhd5h576h64h6fh68bhdch596h59fh55h668h69h5b9h5fdh78h5bch686h64q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5pmds,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-86559649f7-slxjz_openstack(9843bd36-0528-4494-9b3f-f1975e2b15f2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 13:12:53 crc kubenswrapper[4851]: E1001 13:12:53.496056 4851 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/podified-master-centos10/openstack-horizon:watcher_latest" Oct 01 13:12:53 crc kubenswrapper[4851]: E1001 13:12:53.496144 4851 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/podified-master-centos10/openstack-horizon:watcher_latest" Oct 01 13:12:53 crc kubenswrapper[4851]: E1001 13:12:53.496335 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.36:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nc6h584h559h5d8h665h9dh565h697h99hddh5c9h5c8h5b7h6fh5dh9fh5dfhdhbch697h596h557h7bh86h5cfh9h7dh5cbh685h65bh55fh54q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xs4rg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-ddb6c45b5-ncjrz_openstack(2ac095f6-4433-4c0e-9299-dcad17dff9fa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 13:12:53 crc kubenswrapper[4851]: E1001 13:12:53.502399 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-86559649f7-slxjz" podUID="9843bd36-0528-4494-9b3f-f1975e2b15f2" Oct 01 13:12:53 crc kubenswrapper[4851]: E1001 13:12:53.502588 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-ddb6c45b5-ncjrz" podUID="2ac095f6-4433-4c0e-9299-dcad17dff9fa" Oct 01 13:12:53 crc kubenswrapper[4851]: E1001 13:12:53.537765 4851 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/podified-master-centos10/openstack-horizon:watcher_latest" Oct 01 13:12:53 crc kubenswrapper[4851]: E1001 13:12:53.537829 4851 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/podified-master-centos10/openstack-horizon:watcher_latest" Oct 
01 13:12:53 crc kubenswrapper[4851]: E1001 13:12:53.537957 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.36:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndch555h5c4h684h65dhb9h5fch5fbh56h68dh64ch55ch6dh69h66bh68chcchf4hdh95h95hbfhf9h566h5bch5b6h5b8h67fh55h66fhfbhddq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qldlg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-857f448949-w9w5f_openstack(5d3e8ca0-bbef-4757-962b-68ea6f20af9a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 13:12:53 crc kubenswrapper[4851]: E1001 13:12:53.541731 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-857f448949-w9w5f" podUID="5d3e8ca0-bbef-4757-962b-68ea6f20af9a" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.602292 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-f4s7j" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.620053 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-b6cn9" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.666841 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/384e4797-6339-4742-9430-e87739c80e74-config-data\") pod \"384e4797-6339-4742-9430-e87739c80e74\" (UID: \"384e4797-6339-4742-9430-e87739c80e74\") " Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.666911 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6tft\" (UniqueName: \"kubernetes.io/projected/384e4797-6339-4742-9430-e87739c80e74-kube-api-access-l6tft\") pod \"384e4797-6339-4742-9430-e87739c80e74\" (UID: \"384e4797-6339-4742-9430-e87739c80e74\") " Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.666965 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-credential-keys\") pod \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.667832 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch26x\" (UniqueName: \"kubernetes.io/projected/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-kube-api-access-ch26x\") pod \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.667874 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-fernet-keys\") pod \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.667907 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-scripts\") pod \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.667994 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-combined-ca-bundle\") pod \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.668066 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384e4797-6339-4742-9430-e87739c80e74-combined-ca-bundle\") pod \"384e4797-6339-4742-9430-e87739c80e74\" (UID: \"384e4797-6339-4742-9430-e87739c80e74\") " Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.668135 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-config-data\") pod \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\" (UID: \"62c0ded5-8134-4f2e-bd27-0bf08dbe9226\") " Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.668224 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/384e4797-6339-4742-9430-e87739c80e74-db-sync-config-data\") pod 
\"384e4797-6339-4742-9430-e87739c80e74\" (UID: \"384e4797-6339-4742-9430-e87739c80e74\") " Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.674596 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "62c0ded5-8134-4f2e-bd27-0bf08dbe9226" (UID: "62c0ded5-8134-4f2e-bd27-0bf08dbe9226"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.675181 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "62c0ded5-8134-4f2e-bd27-0bf08dbe9226" (UID: "62c0ded5-8134-4f2e-bd27-0bf08dbe9226"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.675724 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-scripts" (OuterVolumeSpecName: "scripts") pod "62c0ded5-8134-4f2e-bd27-0bf08dbe9226" (UID: "62c0ded5-8134-4f2e-bd27-0bf08dbe9226"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.677179 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-kube-api-access-ch26x" (OuterVolumeSpecName: "kube-api-access-ch26x") pod "62c0ded5-8134-4f2e-bd27-0bf08dbe9226" (UID: "62c0ded5-8134-4f2e-bd27-0bf08dbe9226"). InnerVolumeSpecName "kube-api-access-ch26x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.710730 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/384e4797-6339-4742-9430-e87739c80e74-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "384e4797-6339-4742-9430-e87739c80e74" (UID: "384e4797-6339-4742-9430-e87739c80e74"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.714085 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/384e4797-6339-4742-9430-e87739c80e74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "384e4797-6339-4742-9430-e87739c80e74" (UID: "384e4797-6339-4742-9430-e87739c80e74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.721381 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/384e4797-6339-4742-9430-e87739c80e74-kube-api-access-l6tft" (OuterVolumeSpecName: "kube-api-access-l6tft") pod "384e4797-6339-4742-9430-e87739c80e74" (UID: "384e4797-6339-4742-9430-e87739c80e74"). InnerVolumeSpecName "kube-api-access-l6tft". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.731316 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-config-data" (OuterVolumeSpecName: "config-data") pod "62c0ded5-8134-4f2e-bd27-0bf08dbe9226" (UID: "62c0ded5-8134-4f2e-bd27-0bf08dbe9226"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.738190 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62c0ded5-8134-4f2e-bd27-0bf08dbe9226" (UID: "62c0ded5-8134-4f2e-bd27-0bf08dbe9226"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.765918 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/384e4797-6339-4742-9430-e87739c80e74-config-data" (OuterVolumeSpecName: "config-data") pod "384e4797-6339-4742-9430-e87739c80e74" (UID: "384e4797-6339-4742-9430-e87739c80e74"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.770999 4851 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.771035 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch26x\" (UniqueName: \"kubernetes.io/projected/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-kube-api-access-ch26x\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.771052 4851 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.771063 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.771077 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.771089 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384e4797-6339-4742-9430-e87739c80e74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.771099 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62c0ded5-8134-4f2e-bd27-0bf08dbe9226-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.771110 4851 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/384e4797-6339-4742-9430-e87739c80e74-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.771122 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/384e4797-6339-4742-9430-e87739c80e74-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:53 crc kubenswrapper[4851]: I1001 13:12:53.771134 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6tft\" (UniqueName: 
\"kubernetes.io/projected/384e4797-6339-4742-9430-e87739c80e74-kube-api-access-l6tft\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.088343 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-f4s7j" event={"ID":"384e4797-6339-4742-9430-e87739c80e74","Type":"ContainerDied","Data":"58a76a9e7cd97418625e8cbdc573f82c4c3dd61fac2d73395c41aaaed82471fa"} Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.088391 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58a76a9e7cd97418625e8cbdc573f82c4c3dd61fac2d73395c41aaaed82471fa" Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.088357 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-f4s7j" Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.089698 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b6cn9" event={"ID":"62c0ded5-8134-4f2e-bd27-0bf08dbe9226","Type":"ContainerDied","Data":"e4c911699d912cc7e5d30d8a17971fe612afa0d4e0e06587a1622856d34c3a26"} Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.089725 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4c911699d912cc7e5d30d8a17971fe612afa0d4e0e06587a1622856d34c3a26" Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.089761 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b6cn9" Oct 01 13:12:54 crc kubenswrapper[4851]: E1001 13:12:54.199125 4851 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Oct 01 13:12:54 crc kubenswrapper[4851]: E1001 13:12:54.199423 4851 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Oct 01 13:12:54 crc kubenswrapper[4851]: E1001 13:12:54.199558 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:38.102.83.36:5001/podified-master-centos10/openstack-barbican-api:watcher_latest,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nz85j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-l9xpp_openstack(d64a19f8-04aa-4c82-9818-6a50d9e3d62b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 13:12:54 crc kubenswrapper[4851]: E1001 13:12:54.201283 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-l9xpp" podUID="d64a19f8-04aa-4c82-9818-6a50d9e3d62b" Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.784808 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-b6cn9"] Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.791479 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-b6cn9"] Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.909709 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 01 13:12:54 crc kubenswrapper[4851]: E1001 13:12:54.910243 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="384e4797-6339-4742-9430-e87739c80e74" containerName="watcher-db-sync" Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.910266 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="384e4797-6339-4742-9430-e87739c80e74" containerName="watcher-db-sync" Oct 01 13:12:54 crc kubenswrapper[4851]: E1001 13:12:54.910290 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c0ded5-8134-4f2e-bd27-0bf08dbe9226" containerName="keystone-bootstrap" Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.910299 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c0ded5-8134-4f2e-bd27-0bf08dbe9226" containerName="keystone-bootstrap" Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.910516 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="384e4797-6339-4742-9430-e87739c80e74" containerName="watcher-db-sync" Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 
13:12:54.910549 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c0ded5-8134-4f2e-bd27-0bf08dbe9226" containerName="keystone-bootstrap" Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.922415 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.925065 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-wwjvv" Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.925440 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.933248 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-5s4gk"] Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.948122 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5s4gk" Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.950350 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vxcc4" Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.950618 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.950836 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.951168 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.952915 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 01 13:12:54 crc kubenswrapper[4851]: I1001 13:12:54.996296 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5s4gk"] Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.020051 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.022061 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.025108 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.029959 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971ca0ac-6de7-42f1-bf29-5174fd80ced4-config-data\") pod \"watcher-decision-engine-0\" (UID: \"971ca0ac-6de7-42f1-bf29-5174fd80ced4\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.030025 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-scripts\") pod \"keystone-bootstrap-5s4gk\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " pod="openstack/keystone-bootstrap-5s4gk" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.030076 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khlpn\" (UniqueName: \"kubernetes.io/projected/971ca0ac-6de7-42f1-bf29-5174fd80ced4-kube-api-access-khlpn\") pod \"watcher-decision-engine-0\" (UID: \"971ca0ac-6de7-42f1-bf29-5174fd80ced4\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.030095 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971ca0ac-6de7-42f1-bf29-5174fd80ced4-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"971ca0ac-6de7-42f1-bf29-5174fd80ced4\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.030130 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bflkj\" (UniqueName: \"kubernetes.io/projected/9adae7cd-3c60-49d2-9048-138c3050d6f4-kube-api-access-bflkj\") pod \"keystone-bootstrap-5s4gk\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " pod="openstack/keystone-bootstrap-5s4gk" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.030160 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/971ca0ac-6de7-42f1-bf29-5174fd80ced4-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"971ca0ac-6de7-42f1-bf29-5174fd80ced4\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.030177 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-fernet-keys\") pod \"keystone-bootstrap-5s4gk\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " pod="openstack/keystone-bootstrap-5s4gk" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.030201 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-config-data\") pod \"keystone-bootstrap-5s4gk\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " pod="openstack/keystone-bootstrap-5s4gk" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.030237 4851 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-credential-keys\") pod \"keystone-bootstrap-5s4gk\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " pod="openstack/keystone-bootstrap-5s4gk" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.030261 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-combined-ca-bundle\") pod \"keystone-bootstrap-5s4gk\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " pod="openstack/keystone-bootstrap-5s4gk" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.030278 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/971ca0ac-6de7-42f1-bf29-5174fd80ced4-logs\") pod \"watcher-decision-engine-0\" (UID: \"971ca0ac-6de7-42f1-bf29-5174fd80ced4\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.032709 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.034690 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.037531 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.071114 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.079471 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:12:55 crc kubenswrapper[4851]: E1001 13:12:55.099231 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/podified-master-centos10/openstack-barbican-api:watcher_latest\\\"\"" pod="openstack/barbican-db-sync-l9xpp" podUID="d64a19f8-04aa-4c82-9818-6a50d9e3d62b" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.131364 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bflkj\" (UniqueName: \"kubernetes.io/projected/9adae7cd-3c60-49d2-9048-138c3050d6f4-kube-api-access-bflkj\") pod \"keystone-bootstrap-5s4gk\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " pod="openstack/keystone-bootstrap-5s4gk" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.131432 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/971ca0ac-6de7-42f1-bf29-5174fd80ced4-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"971ca0ac-6de7-42f1-bf29-5174fd80ced4\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.131457 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-fernet-keys\") pod \"keystone-bootstrap-5s4gk\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " pod="openstack/keystone-bootstrap-5s4gk" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.131510 4851 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-config-data\") pod \"keystone-bootstrap-5s4gk\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " pod="openstack/keystone-bootstrap-5s4gk" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.131561 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d713b6ba-8532-4b3c-aa32-c97d5547ea62-config-data\") pod \"watcher-api-0\" (UID: \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\") " pod="openstack/watcher-api-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.131591 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-credential-keys\") pod \"keystone-bootstrap-5s4gk\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " pod="openstack/keystone-bootstrap-5s4gk" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.131613 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cad65438-644c-45b7-8267-370126fe6aef-config-data\") pod \"watcher-applier-0\" (UID: \"cad65438-644c-45b7-8267-370126fe6aef\") " pod="openstack/watcher-applier-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.131647 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-combined-ca-bundle\") pod \"keystone-bootstrap-5s4gk\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " pod="openstack/keystone-bootstrap-5s4gk" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.131668 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cad65438-644c-45b7-8267-370126fe6aef-logs\") pod \"watcher-applier-0\" (UID: \"cad65438-644c-45b7-8267-370126fe6aef\") " pod="openstack/watcher-applier-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.131693 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/971ca0ac-6de7-42f1-bf29-5174fd80ced4-logs\") pod \"watcher-decision-engine-0\" (UID: \"971ca0ac-6de7-42f1-bf29-5174fd80ced4\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.131730 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d713b6ba-8532-4b3c-aa32-c97d5547ea62-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\") " pod="openstack/watcher-api-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.131755 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971ca0ac-6de7-42f1-bf29-5174fd80ced4-config-data\") pod \"watcher-decision-engine-0\" (UID: \"971ca0ac-6de7-42f1-bf29-5174fd80ced4\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.131783 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cad65438-644c-45b7-8267-370126fe6aef-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"cad65438-644c-45b7-8267-370126fe6aef\") " pod="openstack/watcher-applier-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.131808 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d713b6ba-8532-4b3c-aa32-c97d5547ea62-logs\") pod \"watcher-api-0\" (UID: \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\") " pod="openstack/watcher-api-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.131834 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8hdv\" (UniqueName: \"kubernetes.io/projected/d713b6ba-8532-4b3c-aa32-c97d5547ea62-kube-api-access-x8hdv\") pod \"watcher-api-0\" (UID: \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\") " pod="openstack/watcher-api-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.131866 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-scripts\") pod \"keystone-bootstrap-5s4gk\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " pod="openstack/keystone-bootstrap-5s4gk" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.131904 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d713b6ba-8532-4b3c-aa32-c97d5547ea62-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\") " pod="openstack/watcher-api-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.131931 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khlpn\" (UniqueName: \"kubernetes.io/projected/971ca0ac-6de7-42f1-bf29-5174fd80ced4-kube-api-access-khlpn\") pod \"watcher-decision-engine-0\" (UID: \"971ca0ac-6de7-42f1-bf29-5174fd80ced4\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.131957 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971ca0ac-6de7-42f1-bf29-5174fd80ced4-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"971ca0ac-6de7-42f1-bf29-5174fd80ced4\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.131995 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs95w\" (UniqueName: \"kubernetes.io/projected/cad65438-644c-45b7-8267-370126fe6aef-kube-api-access-xs95w\") pod \"watcher-applier-0\" (UID: \"cad65438-644c-45b7-8267-370126fe6aef\") " pod="openstack/watcher-applier-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.133818 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/971ca0ac-6de7-42f1-bf29-5174fd80ced4-logs\") pod \"watcher-decision-engine-0\" (UID: \"971ca0ac-6de7-42f1-bf29-5174fd80ced4\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.138079 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-scripts\") pod \"keystone-bootstrap-5s4gk\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " 
pod="openstack/keystone-bootstrap-5s4gk" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.138688 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971ca0ac-6de7-42f1-bf29-5174fd80ced4-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"971ca0ac-6de7-42f1-bf29-5174fd80ced4\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.140172 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-fernet-keys\") pod \"keystone-bootstrap-5s4gk\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " pod="openstack/keystone-bootstrap-5s4gk" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.140308 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971ca0ac-6de7-42f1-bf29-5174fd80ced4-config-data\") pod \"watcher-decision-engine-0\" (UID: \"971ca0ac-6de7-42f1-bf29-5174fd80ced4\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.141988 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-combined-ca-bundle\") pod \"keystone-bootstrap-5s4gk\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " pod="openstack/keystone-bootstrap-5s4gk" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.142206 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/971ca0ac-6de7-42f1-bf29-5174fd80ced4-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"971ca0ac-6de7-42f1-bf29-5174fd80ced4\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.145188 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-credential-keys\") pod \"keystone-bootstrap-5s4gk\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " pod="openstack/keystone-bootstrap-5s4gk" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.150012 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khlpn\" (UniqueName: \"kubernetes.io/projected/971ca0ac-6de7-42f1-bf29-5174fd80ced4-kube-api-access-khlpn\") pod \"watcher-decision-engine-0\" (UID: \"971ca0ac-6de7-42f1-bf29-5174fd80ced4\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.152998 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-config-data\") pod \"keystone-bootstrap-5s4gk\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " pod="openstack/keystone-bootstrap-5s4gk" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.154634 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bflkj\" (UniqueName: \"kubernetes.io/projected/9adae7cd-3c60-49d2-9048-138c3050d6f4-kube-api-access-bflkj\") pod \"keystone-bootstrap-5s4gk\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " pod="openstack/keystone-bootstrap-5s4gk" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.234319 4851 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d713b6ba-8532-4b3c-aa32-c97d5547ea62-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\") " pod="openstack/watcher-api-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.235064 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs95w\" (UniqueName: \"kubernetes.io/projected/cad65438-644c-45b7-8267-370126fe6aef-kube-api-access-xs95w\") pod \"watcher-applier-0\" (UID: \"cad65438-644c-45b7-8267-370126fe6aef\") " pod="openstack/watcher-applier-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.235485 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d713b6ba-8532-4b3c-aa32-c97d5547ea62-config-data\") pod \"watcher-api-0\" (UID: \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\") " pod="openstack/watcher-api-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.235675 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cad65438-644c-45b7-8267-370126fe6aef-config-data\") pod \"watcher-applier-0\" (UID: \"cad65438-644c-45b7-8267-370126fe6aef\") " pod="openstack/watcher-applier-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.235789 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cad65438-644c-45b7-8267-370126fe6aef-logs\") pod \"watcher-applier-0\" (UID: \"cad65438-644c-45b7-8267-370126fe6aef\") " pod="openstack/watcher-applier-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.235940 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d713b6ba-8532-4b3c-aa32-c97d5547ea62-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\") " pod="openstack/watcher-api-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.236044 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cad65438-644c-45b7-8267-370126fe6aef-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"cad65438-644c-45b7-8267-370126fe6aef\") " pod="openstack/watcher-applier-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.236169 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d713b6ba-8532-4b3c-aa32-c97d5547ea62-logs\") pod \"watcher-api-0\" (UID: \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\") " pod="openstack/watcher-api-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.236369 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8hdv\" (UniqueName: \"kubernetes.io/projected/d713b6ba-8532-4b3c-aa32-c97d5547ea62-kube-api-access-x8hdv\") pod \"watcher-api-0\" (UID: \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\") " pod="openstack/watcher-api-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.241279 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d713b6ba-8532-4b3c-aa32-c97d5547ea62-logs\") pod \"watcher-api-0\" (UID: \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\") " pod="openstack/watcher-api-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.241292 4851 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cad65438-644c-45b7-8267-370126fe6aef-logs\") pod \"watcher-applier-0\" (UID: \"cad65438-644c-45b7-8267-370126fe6aef\") " pod="openstack/watcher-applier-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.245979 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d713b6ba-8532-4b3c-aa32-c97d5547ea62-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\") " pod="openstack/watcher-api-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.246680 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.249790 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d713b6ba-8532-4b3c-aa32-c97d5547ea62-config-data\") pod \"watcher-api-0\" (UID: \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\") " pod="openstack/watcher-api-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.254326 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d713b6ba-8532-4b3c-aa32-c97d5547ea62-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\") " pod="openstack/watcher-api-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.254516 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cad65438-644c-45b7-8267-370126fe6aef-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"cad65438-644c-45b7-8267-370126fe6aef\") " pod="openstack/watcher-applier-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.257257 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cad65438-644c-45b7-8267-370126fe6aef-config-data\") pod \"watcher-applier-0\" (UID: \"cad65438-644c-45b7-8267-370126fe6aef\") " pod="openstack/watcher-applier-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.257979 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8hdv\" (UniqueName: \"kubernetes.io/projected/d713b6ba-8532-4b3c-aa32-c97d5547ea62-kube-api-access-x8hdv\") pod \"watcher-api-0\" (UID: \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\") " pod="openstack/watcher-api-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.260721 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs95w\" (UniqueName: \"kubernetes.io/projected/cad65438-644c-45b7-8267-370126fe6aef-kube-api-access-xs95w\") pod \"watcher-applier-0\" (UID: \"cad65438-644c-45b7-8267-370126fe6aef\") " pod="openstack/watcher-applier-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.277865 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5s4gk" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.350255 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.370047 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.982169 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 13:12:55 crc kubenswrapper[4851]: I1001 13:12:55.982214 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 13:12:56 crc kubenswrapper[4851]: I1001 13:12:56.339636 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62c0ded5-8134-4f2e-bd27-0bf08dbe9226" path="/var/lib/kubelet/pods/62c0ded5-8134-4f2e-bd27-0bf08dbe9226/volumes" Oct 01 13:13:00 crc kubenswrapper[4851]: I1001 13:13:00.050555 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:13:00 crc kubenswrapper[4851]: I1001 13:13:00.051039 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:13:00 crc kubenswrapper[4851]: I1001 13:13:00.051101 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 13:13:07 crc kubenswrapper[4851]: I1001 13:13:00.051978 4851 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"91826d492b7e20fab5770efae0e5337ce96401b4dbf9f9356c89538e943aab30"} pod="openshift-machine-config-operator/machine-config-daemon-fv72m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:13:07 crc kubenswrapper[4851]: I1001 13:13:00.052065 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" containerID="cri-o://91826d492b7e20fab5770efae0e5337ce96401b4dbf9f9356c89538e943aab30" gracePeriod=600 Oct 01 13:13:07 crc kubenswrapper[4851]: I1001 13:13:03.463658 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc" podUID="cd0e2b8e-610a-4a44-b7a7-41e7af05fde1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: i/o timeout" Oct 01 13:13:08 crc kubenswrapper[4851]: I1001 13:13:08.253424 4851 generic.go:334] "Generic (PLEG): container finished" podID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerID="91826d492b7e20fab5770efae0e5337ce96401b4dbf9f9356c89538e943aab30" exitCode=0 Oct 01 13:13:08 crc kubenswrapper[4851]: I1001 13:13:08.253519 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerDied","Data":"91826d492b7e20fab5770efae0e5337ce96401b4dbf9f9356c89538e943aab30"} Oct 01 13:13:08 crc kubenswrapper[4851]: I1001 13:13:08.254026 4851 scope.go:117] "RemoveContainer" containerID="be5d6b868e9238c5d4395c014452c5cfe7dc87bf6a9741e8af0bded2d6b25de6" Oct 01 13:13:08 crc 
kubenswrapper[4851]: I1001 13:13:08.464542 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc" podUID="cd0e2b8e-610a-4a44-b7a7-41e7af05fde1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: i/o timeout" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.229493 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.234676 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.265834 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-scripts\") pod \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.265956 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-dns-swift-storage-0\") pod \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.266024 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-ovsdbserver-nb\") pod \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.266129 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-combined-ca-bundle\") pod \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.266936 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-logs\") pod \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.267028 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.267106 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-config-data\") pod \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.267188 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-dns-svc\") pod \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.267286 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-config\") pod \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.267337 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-httpd-run\") pod \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.267400 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljkwf\" (UniqueName: \"kubernetes.io/projected/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-kube-api-access-ljkwf\") pod \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\" (UID: \"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7\") " Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.267408 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-logs" (OuterVolumeSpecName: "logs") pod "0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7" (UID: "0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.267461 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-ovsdbserver-sb\") pod \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.267542 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgd8k\" (UniqueName: \"kubernetes.io/projected/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-kube-api-access-vgd8k\") pod \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\" (UID: \"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1\") " Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.268215 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7" (UID: "0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.274356 4851 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.274409 4851 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.282582 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7" (UID: "0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.287752 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-scripts" (OuterVolumeSpecName: "scripts") pod "0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7" (UID: "0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.297928 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-kube-api-access-ljkwf" (OuterVolumeSpecName: "kube-api-access-ljkwf") pod "0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7" (UID: "0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7"). InnerVolumeSpecName "kube-api-access-ljkwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.299180 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-kube-api-access-vgd8k" (OuterVolumeSpecName: "kube-api-access-vgd8k") pod "cd0e2b8e-610a-4a44-b7a7-41e7af05fde1" (UID: "cd0e2b8e-610a-4a44-b7a7-41e7af05fde1"). InnerVolumeSpecName "kube-api-access-vgd8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.318059 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.318074 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7","Type":"ContainerDied","Data":"46d61a47b6bcef4c3a31ac3ae4e8e2d8397605420f1fe3ad4f8326b01c5a2190"} Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.325558 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc" event={"ID":"cd0e2b8e-610a-4a44-b7a7-41e7af05fde1","Type":"ContainerDied","Data":"a6fc14c40426a06cd6203a4d72f9c38f10a8b71d9f78438570572446e232b317"} Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.325626 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.340434 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7" (UID: "0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.347552 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cd0e2b8e-610a-4a44-b7a7-41e7af05fde1" (UID: "cd0e2b8e-610a-4a44-b7a7-41e7af05fde1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.373258 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cd0e2b8e-610a-4a44-b7a7-41e7af05fde1" (UID: "cd0e2b8e-610a-4a44-b7a7-41e7af05fde1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.376743 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljkwf\" (UniqueName: \"kubernetes.io/projected/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-kube-api-access-ljkwf\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.376768 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgd8k\" (UniqueName: \"kubernetes.io/projected/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-kube-api-access-vgd8k\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.376777 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.376785 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.376793 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.376810 4851 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.376820 4851 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.382337 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-config" (OuterVolumeSpecName: "config") pod "cd0e2b8e-610a-4a44-b7a7-41e7af05fde1" (UID: "cd0e2b8e-610a-4a44-b7a7-41e7af05fde1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.399242 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cd0e2b8e-610a-4a44-b7a7-41e7af05fde1" (UID: "cd0e2b8e-610a-4a44-b7a7-41e7af05fde1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.400114 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cd0e2b8e-610a-4a44-b7a7-41e7af05fde1" (UID: "cd0e2b8e-610a-4a44-b7a7-41e7af05fde1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.402886 4851 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.410207 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-config-data" (OuterVolumeSpecName: "config-data") pod "0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7" (UID: "0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.478391 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.478426 4851 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.478436 4851 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.478445 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.478454 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.657030 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b6747f4f9-jm2gc"] Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.669852 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b6747f4f9-jm2gc"] Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.699280 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.711072 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.722309 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:13:10 crc kubenswrapper[4851]: E1001 13:13:10.722906 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd0e2b8e-610a-4a44-b7a7-41e7af05fde1" containerName="init" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.722933 4851 
state_mem.go:107] "Deleted CPUSet assignment" podUID="cd0e2b8e-610a-4a44-b7a7-41e7af05fde1" containerName="init" Oct 01 13:13:10 crc kubenswrapper[4851]: E1001 13:13:10.722950 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7" containerName="glance-httpd" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.722960 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7" containerName="glance-httpd" Oct 01 13:13:10 crc kubenswrapper[4851]: E1001 13:13:10.722982 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7" containerName="glance-log" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.722993 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7" containerName="glance-log" Oct 01 13:13:10 crc kubenswrapper[4851]: E1001 13:13:10.723025 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd0e2b8e-610a-4a44-b7a7-41e7af05fde1" containerName="dnsmasq-dns" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.723038 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd0e2b8e-610a-4a44-b7a7-41e7af05fde1" containerName="dnsmasq-dns" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.723403 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd0e2b8e-610a-4a44-b7a7-41e7af05fde1" containerName="dnsmasq-dns" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.723510 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7" containerName="glance-httpd" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.723530 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7" containerName="glance-log" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.753777 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.753929 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.765220 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.765975 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.884936 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44hbg\" (UniqueName: \"kubernetes.io/projected/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-kube-api-access-44hbg\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.884985 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.885020 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.885042 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-logs\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.885172 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.885438 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.885463 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.885543 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.988270 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44hbg\" (UniqueName: \"kubernetes.io/projected/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-kube-api-access-44hbg\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.988347 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.988417 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.988445 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-logs\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.988534 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.988718 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.988742 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.988805 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.989285 4851 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") device mount path 
\"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.989338 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.989375 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-logs\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.993376 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.993664 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:10 crc kubenswrapper[4851]: I1001 13:13:10.995128 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:11 crc kubenswrapper[4851]: I1001 13:13:11.005215 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:11 crc kubenswrapper[4851]: I1001 13:13:11.006538 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44hbg\" (UniqueName: \"kubernetes.io/projected/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-kube-api-access-44hbg\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:11 crc kubenswrapper[4851]: I1001 13:13:11.015115 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:13:11 crc kubenswrapper[4851]: I1001 13:13:11.079617 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 13:13:12 crc kubenswrapper[4851]: I1001 13:13:12.347002 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7" path="/var/lib/kubelet/pods/0d81c0d9-cf73-42b3-bc5a-69ca2699b8a7/volumes" Oct 01 13:13:12 crc kubenswrapper[4851]: I1001 13:13:12.350188 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd0e2b8e-610a-4a44-b7a7-41e7af05fde1" path="/var/lib/kubelet/pods/cd0e2b8e-610a-4a44-b7a7-41e7af05fde1/volumes" Oct 01 13:13:12 crc kubenswrapper[4851]: E1001 13:13:12.859658 4851 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Oct 01 13:13:12 crc kubenswrapper[4851]: E1001 13:13:12.859717 4851 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Oct 01 13:13:12 crc kubenswrapper[4851]: E1001 13:13:12.860135 4851 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:38.102.83.36:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hsbs8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},
StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-9zwxh_openstack(08f27829-7a4b-49d3-aed3-dbae56854228): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 13:13:12 crc kubenswrapper[4851]: E1001 13:13:12.861716 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-9zwxh" podUID="08f27829-7a4b-49d3-aed3-dbae56854228" Oct 01 13:13:12 crc kubenswrapper[4851]: I1001 13:13:12.994022 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86559649f7-slxjz" Oct 01 13:13:12 crc kubenswrapper[4851]: I1001 13:13:12.998841 4851 scope.go:117] "RemoveContainer" containerID="e7f0766ad644552ba0536fb5d35b86be8b435b6b09360208aea4186770f40dae" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.001483 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-857f448949-w9w5f" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.017577 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-ddb6c45b5-ncjrz" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.032278 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qldlg\" (UniqueName: \"kubernetes.io/projected/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-kube-api-access-qldlg\") pod \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\" (UID: \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\") " Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.032647 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-scripts\") pod \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\" (UID: \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\") " Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.032687 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-logs\") pod \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\" (UID: \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\") " Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.032705 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9843bd36-0528-4494-9b3f-f1975e2b15f2-logs\") pod \"9843bd36-0528-4494-9b3f-f1975e2b15f2\" (UID: \"9843bd36-0528-4494-9b3f-f1975e2b15f2\") " Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.032767 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-config-data\") pod \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\" (UID: \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\") " Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.032817 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pmds\" (UniqueName: \"kubernetes.io/projected/9843bd36-0528-4494-9b3f-f1975e2b15f2-kube-api-access-5pmds\") pod \"9843bd36-0528-4494-9b3f-f1975e2b15f2\" (UID: \"9843bd36-0528-4494-9b3f-f1975e2b15f2\") " Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.032853 4851 reconciler_common.go:159] 
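The pull failure above surfaces the same CRI error three times, once from the image service wrapper (log.go:32), once from kuberuntime_image.go:55, and once as the container-start "Unhandled Error", before pod_workers.go gives up on the sync with ErrImagePull. A minimal sketch for tallying such failures per image from journal text on stdin; pullfail.go is an assumed name, and the regex relies only on the err="..." / image="..." key-value style visible above.

// pullfail.go (assumed name). Tallies "Failed to pull image" journal
// entries per image and remembers the most recent error text for each.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var pullErr = regexp.MustCompile(`"Failed to pull image" err="([^"]*)" image="([^"]+)"`)

func main() {
	counts := map[string]int{}
	reasons := map[string]string{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		if m := pullErr.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[2]]++
			reasons[m[2]] = m[1] // keep the most recent error text
		}
	}
	for img, n := range counts {
		fmt.Printf("%dx %s\n    last error: %s\n", n, img, reasons[img])
	}
}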
"operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9843bd36-0528-4494-9b3f-f1975e2b15f2-horizon-secret-key\") pod \"9843bd36-0528-4494-9b3f-f1975e2b15f2\" (UID: \"9843bd36-0528-4494-9b3f-f1975e2b15f2\") " Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.032872 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9843bd36-0528-4494-9b3f-f1975e2b15f2-config-data\") pod \"9843bd36-0528-4494-9b3f-f1975e2b15f2\" (UID: \"9843bd36-0528-4494-9b3f-f1975e2b15f2\") " Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.032937 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-horizon-secret-key\") pod \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\" (UID: \"5d3e8ca0-bbef-4757-962b-68ea6f20af9a\") " Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.032960 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9843bd36-0528-4494-9b3f-f1975e2b15f2-scripts\") pod \"9843bd36-0528-4494-9b3f-f1975e2b15f2\" (UID: \"9843bd36-0528-4494-9b3f-f1975e2b15f2\") " Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.033917 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9843bd36-0528-4494-9b3f-f1975e2b15f2-scripts" (OuterVolumeSpecName: "scripts") pod "9843bd36-0528-4494-9b3f-f1975e2b15f2" (UID: "9843bd36-0528-4494-9b3f-f1975e2b15f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.034359 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9843bd36-0528-4494-9b3f-f1975e2b15f2-logs" (OuterVolumeSpecName: "logs") pod "9843bd36-0528-4494-9b3f-f1975e2b15f2" (UID: "9843bd36-0528-4494-9b3f-f1975e2b15f2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.034377 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-scripts" (OuterVolumeSpecName: "scripts") pod "5d3e8ca0-bbef-4757-962b-68ea6f20af9a" (UID: "5d3e8ca0-bbef-4757-962b-68ea6f20af9a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.034572 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9843bd36-0528-4494-9b3f-f1975e2b15f2-config-data" (OuterVolumeSpecName: "config-data") pod "9843bd36-0528-4494-9b3f-f1975e2b15f2" (UID: "9843bd36-0528-4494-9b3f-f1975e2b15f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.034979 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-logs" (OuterVolumeSpecName: "logs") pod "5d3e8ca0-bbef-4757-962b-68ea6f20af9a" (UID: "5d3e8ca0-bbef-4757-962b-68ea6f20af9a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.035373 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-config-data" (OuterVolumeSpecName: "config-data") pod "5d3e8ca0-bbef-4757-962b-68ea6f20af9a" (UID: "5d3e8ca0-bbef-4757-962b-68ea6f20af9a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.038659 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9843bd36-0528-4494-9b3f-f1975e2b15f2-kube-api-access-5pmds" (OuterVolumeSpecName: "kube-api-access-5pmds") pod "9843bd36-0528-4494-9b3f-f1975e2b15f2" (UID: "9843bd36-0528-4494-9b3f-f1975e2b15f2"). InnerVolumeSpecName "kube-api-access-5pmds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.038675 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9843bd36-0528-4494-9b3f-f1975e2b15f2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9843bd36-0528-4494-9b3f-f1975e2b15f2" (UID: "9843bd36-0528-4494-9b3f-f1975e2b15f2"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.040348 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-kube-api-access-qldlg" (OuterVolumeSpecName: "kube-api-access-qldlg") pod "5d3e8ca0-bbef-4757-962b-68ea6f20af9a" (UID: "5d3e8ca0-bbef-4757-962b-68ea6f20af9a"). InnerVolumeSpecName "kube-api-access-qldlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.043642 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5d3e8ca0-bbef-4757-962b-68ea6f20af9a" (UID: "5d3e8ca0-bbef-4757-962b-68ea6f20af9a"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.124683 4851 scope.go:117] "RemoveContainer" containerID="bd68d31381d20fb45c7d0cf170937377640a94cb2ee1ec77d59b0165c23c3a88" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.134436 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ac095f6-4433-4c0e-9299-dcad17dff9fa-scripts\") pod \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\" (UID: \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\") " Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.134596 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs4rg\" (UniqueName: \"kubernetes.io/projected/2ac095f6-4433-4c0e-9299-dcad17dff9fa-kube-api-access-xs4rg\") pod \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\" (UID: \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\") " Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.134668 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ac095f6-4433-4c0e-9299-dcad17dff9fa-logs\") pod \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\" (UID: \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\") " Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.134719 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2ac095f6-4433-4c0e-9299-dcad17dff9fa-horizon-secret-key\") pod \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\" (UID: \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\") " Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.134821 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ac095f6-4433-4c0e-9299-dcad17dff9fa-config-data\") pod \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\" (UID: \"2ac095f6-4433-4c0e-9299-dcad17dff9fa\") " Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.135152 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qldlg\" (UniqueName: \"kubernetes.io/projected/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-kube-api-access-qldlg\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.135172 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.135181 4851 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.135189 4851 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9843bd36-0528-4494-9b3f-f1975e2b15f2-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.135199 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.135208 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pmds\" (UniqueName: \"kubernetes.io/projected/9843bd36-0528-4494-9b3f-f1975e2b15f2-kube-api-access-5pmds\") on node \"crc\" DevicePath 
\"\"" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.135218 4851 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9843bd36-0528-4494-9b3f-f1975e2b15f2-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.135227 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9843bd36-0528-4494-9b3f-f1975e2b15f2-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.135235 4851 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5d3e8ca0-bbef-4757-962b-68ea6f20af9a-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.135242 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9843bd36-0528-4494-9b3f-f1975e2b15f2-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.136037 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ac095f6-4433-4c0e-9299-dcad17dff9fa-scripts" (OuterVolumeSpecName: "scripts") pod "2ac095f6-4433-4c0e-9299-dcad17dff9fa" (UID: "2ac095f6-4433-4c0e-9299-dcad17dff9fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.136258 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ac095f6-4433-4c0e-9299-dcad17dff9fa-logs" (OuterVolumeSpecName: "logs") pod "2ac095f6-4433-4c0e-9299-dcad17dff9fa" (UID: "2ac095f6-4433-4c0e-9299-dcad17dff9fa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.138008 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ac095f6-4433-4c0e-9299-dcad17dff9fa-config-data" (OuterVolumeSpecName: "config-data") pod "2ac095f6-4433-4c0e-9299-dcad17dff9fa" (UID: "2ac095f6-4433-4c0e-9299-dcad17dff9fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.139073 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ac095f6-4433-4c0e-9299-dcad17dff9fa-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2ac095f6-4433-4c0e-9299-dcad17dff9fa" (UID: "2ac095f6-4433-4c0e-9299-dcad17dff9fa"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.140749 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ac095f6-4433-4c0e-9299-dcad17dff9fa-kube-api-access-xs4rg" (OuterVolumeSpecName: "kube-api-access-xs4rg") pod "2ac095f6-4433-4c0e-9299-dcad17dff9fa" (UID: "2ac095f6-4433-4c0e-9299-dcad17dff9fa"). InnerVolumeSpecName "kube-api-access-xs4rg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.195848 4851 scope.go:117] "RemoveContainer" containerID="0a5516bbe7b63d7f5a6de326a8acefedb9dac9702050318a7689407ae1994a0a" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.236452 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ac095f6-4433-4c0e-9299-dcad17dff9fa-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.236479 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ac095f6-4433-4c0e-9299-dcad17dff9fa-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.236491 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs4rg\" (UniqueName: \"kubernetes.io/projected/2ac095f6-4433-4c0e-9299-dcad17dff9fa-kube-api-access-xs4rg\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.238127 4851 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ac095f6-4433-4c0e-9299-dcad17dff9fa-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.238138 4851 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2ac095f6-4433-4c0e-9299-dcad17dff9fa-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.264145 4851 scope.go:117] "RemoveContainer" containerID="772d5d5852d36e36f1bb3aaae8b3f5b540ed039eda11764bc681d2622ac6dde5" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.364511 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-857f448949-w9w5f" event={"ID":"5d3e8ca0-bbef-4757-962b-68ea6f20af9a","Type":"ContainerDied","Data":"79a580227f4b324e31e7113073a52f17b5fbae984c47d5ae9fa6244a6f53c1fa"} Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.364569 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-857f448949-w9w5f" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.372727 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86559649f7-slxjz" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.372729 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86559649f7-slxjz" event={"ID":"9843bd36-0528-4494-9b3f-f1975e2b15f2","Type":"ContainerDied","Data":"eb225048fab6e786ac8ac135ec5ab7489e7ab2194762457f9f1fd7e4d0a59547"} Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.378535 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-ddb6c45b5-ncjrz" event={"ID":"2ac095f6-4433-4c0e-9299-dcad17dff9fa","Type":"ContainerDied","Data":"583508f75a92ef78d3baa331640d1923a76e2a860b71def8ca05dbee31bc2465"} Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.378617 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-ddb6c45b5-ncjrz" Oct 01 13:13:13 crc kubenswrapper[4851]: E1001 13:13:13.387629 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-9zwxh" podUID="08f27829-7a4b-49d3-aed3-dbae56854228" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.416132 4851 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.464719 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b6747f4f9-jm2gc" podUID="cd0e2b8e-610a-4a44-b7a7-41e7af05fde1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: i/o timeout" Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.530427 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86559649f7-slxjz"] Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.544673 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-86559649f7-slxjz"] Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.565774 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-ddb6c45b5-ncjrz"] Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.574745 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-ddb6c45b5-ncjrz"] Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.591820 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-857f448949-w9w5f"] Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.600745 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-857f448949-w9w5f"] Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.807330 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fvbs7"] Oct 01 13:13:13 crc kubenswrapper[4851]: W1001 13:13:13.834092 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfc29451_e27b_4bd0_9a5f_6a177e7621be.slice/crio-ba1a72275b809385d92871dc9d3c962a81158ed9e9aeb716e8a510bb35196b7f WatchSource:0}: Error finding container ba1a72275b809385d92871dc9d3c962a81158ed9e9aeb716e8a510bb35196b7f: Status 404 returned error can't find the container with id ba1a72275b809385d92871dc9d3c962a81158ed9e9aeb716e8a510bb35196b7f Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.865776 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58c9859d68-bckn5"] Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.885235 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67b666754b-b52ns"] Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.891842 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.916640 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.928530 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5s4gk"] Oct 01 13:13:13 crc kubenswrapper[4851]: I1001 13:13:13.985241 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:13:14 crc 
kubenswrapper[4851]: W1001 13:13:14.018455 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd713b6ba_8532_4b3c_aa32_c97d5547ea62.slice/crio-793317acdf31b9e0cbe5771833bcd4a2f078f89809666297fa4c89b4d55d5ee4 WatchSource:0}: Error finding container 793317acdf31b9e0cbe5771833bcd4a2f078f89809666297fa4c89b4d55d5ee4: Status 404 returned error can't find the container with id 793317acdf31b9e0cbe5771833bcd4a2f078f89809666297fa4c89b4d55d5ee4 Oct 01 13:13:14 crc kubenswrapper[4851]: I1001 13:13:14.081840 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:13:14 crc kubenswrapper[4851]: I1001 13:13:14.345687 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ac095f6-4433-4c0e-9299-dcad17dff9fa" path="/var/lib/kubelet/pods/2ac095f6-4433-4c0e-9299-dcad17dff9fa/volumes" Oct 01 13:13:14 crc kubenswrapper[4851]: I1001 13:13:14.346476 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d3e8ca0-bbef-4757-962b-68ea6f20af9a" path="/var/lib/kubelet/pods/5d3e8ca0-bbef-4757-962b-68ea6f20af9a/volumes" Oct 01 13:13:14 crc kubenswrapper[4851]: I1001 13:13:14.346912 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9843bd36-0528-4494-9b3f-f1975e2b15f2" path="/var/lib/kubelet/pods/9843bd36-0528-4494-9b3f-f1975e2b15f2/volumes" Oct 01 13:13:14 crc kubenswrapper[4851]: I1001 13:13:14.391718 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gk28q" event={"ID":"a1eb277b-5338-4d54-939f-6d636f34d14c","Type":"ContainerStarted","Data":"87f349687f0c6455251e99734637cc08652e820d4dd4e32b98d2c0cb03974969"} Oct 01 13:13:14 crc kubenswrapper[4851]: I1001 13:13:14.396234 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55afd90b-7461-4db7-89a6-d45f9bdcb1b3","Type":"ContainerStarted","Data":"11a1c08f2113a358b5d22ecdf09d9cd74ac93459e063a5735048c02487b5cb3d"} Oct 01 13:13:14 crc kubenswrapper[4851]: I1001 13:13:14.397752 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fvbs7" event={"ID":"bfc29451-e27b-4bd0-9a5f-6a177e7621be","Type":"ContainerStarted","Data":"ba1a72275b809385d92871dc9d3c962a81158ed9e9aeb716e8a510bb35196b7f"} Oct 01 13:13:14 crc kubenswrapper[4851]: I1001 13:13:14.399326 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"cad65438-644c-45b7-8267-370126fe6aef","Type":"ContainerStarted","Data":"0dea95ca0cb3b278c30674a90de770c0df959b3d1682eae8e619cdf0c3995d49"} Oct 01 13:13:14 crc kubenswrapper[4851]: I1001 13:13:14.404676 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"971ca0ac-6de7-42f1-bf29-5174fd80ced4","Type":"ContainerStarted","Data":"9366cb5c601a7294de508ed1171fc1607fc476a9ce53000f700caf214c21b377"} Oct 01 13:13:14 crc kubenswrapper[4851]: I1001 13:13:14.405986 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67b666754b-b52ns" event={"ID":"1db63449-71cb-4fa2-86db-43a83a914643","Type":"ContainerStarted","Data":"a0d437c9c4f231e21a8f5468d45f8371f0cd0f52b2d05edefaf9a17aa828647a"} Oct 01 13:13:14 crc kubenswrapper[4851]: I1001 13:13:14.416272 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58c9859d68-bckn5" 
event={"ID":"2fcf93f8-06db-4cab-8699-9051ca2ae50a","Type":"ContainerStarted","Data":"9298bbc7789cad20862dab7a83b6ef673dcec807ae977a9d3ba1abe036fbe50c"} Oct 01 13:13:14 crc kubenswrapper[4851]: I1001 13:13:14.418129 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"556db500-a192-4dfa-8a7f-4ab1d1e7e15b","Type":"ContainerStarted","Data":"572b4c97eb7a514af19a21b9fb1ee8427107ac82603d916c74fcc825659e3f41"} Oct 01 13:13:14 crc kubenswrapper[4851]: I1001 13:13:14.419888 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-l9xpp" event={"ID":"d64a19f8-04aa-4c82-9818-6a50d9e3d62b","Type":"ContainerStarted","Data":"8ba017052fe7382141520764bcb6eefb2e1e996457cd67073bd2717c8ca8b391"} Oct 01 13:13:14 crc kubenswrapper[4851]: I1001 13:13:14.420154 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-gk28q" podStartSLOduration=3.680662495 podStartE2EDuration="52.420134355s" podCreationTimestamp="2025-10-01 13:12:22 +0000 UTC" firstStartedPulling="2025-10-01 13:12:24.385084213 +0000 UTC m=+1152.730201699" lastFinishedPulling="2025-10-01 13:13:13.124556073 +0000 UTC m=+1201.469673559" observedRunningTime="2025-10-01 13:13:14.410492399 +0000 UTC m=+1202.755609905" watchObservedRunningTime="2025-10-01 13:13:14.420134355 +0000 UTC m=+1202.765251831" Oct 01 13:13:14 crc kubenswrapper[4851]: I1001 13:13:14.422419 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"02522c45-cac6-496b-b1dc-83275b924b18","Type":"ContainerStarted","Data":"331c51339c1570f05b723f2d9231105473c006fd4d3ea5e3c7f2d2e9306c1741"} Oct 01 13:13:14 crc kubenswrapper[4851]: I1001 13:13:14.422551 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="02522c45-cac6-496b-b1dc-83275b924b18" containerName="glance-log" containerID="cri-o://884f973bd7f32cb81005f66435625de6d609be575f825ef0e23fc17961a52f77" gracePeriod=30 Oct 01 13:13:14 crc kubenswrapper[4851]: I1001 13:13:14.422643 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="02522c45-cac6-496b-b1dc-83275b924b18" containerName="glance-httpd" containerID="cri-o://331c51339c1570f05b723f2d9231105473c006fd4d3ea5e3c7f2d2e9306c1741" gracePeriod=30 Oct 01 13:13:14 crc kubenswrapper[4851]: I1001 13:13:14.425609 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"d713b6ba-8532-4b3c-aa32-c97d5547ea62","Type":"ContainerStarted","Data":"793317acdf31b9e0cbe5771833bcd4a2f078f89809666297fa4c89b4d55d5ee4"} Oct 01 13:13:14 crc kubenswrapper[4851]: I1001 13:13:14.427606 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5s4gk" event={"ID":"9adae7cd-3c60-49d2-9048-138c3050d6f4","Type":"ContainerStarted","Data":"03dfd31b0da8d89f48a758ce8d7080cfdbff8b8d9cceb9c284fd2ddc8d2d3984"} Oct 01 13:13:14 crc kubenswrapper[4851]: I1001 13:13:14.437000 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-l9xpp" podStartSLOduration=3.333020156 podStartE2EDuration="51.436983555s" podCreationTimestamp="2025-10-01 13:12:23 +0000 UTC" firstStartedPulling="2025-10-01 13:12:25.07511929 +0000 UTC m=+1153.420236776" lastFinishedPulling="2025-10-01 13:13:13.179082689 +0000 UTC m=+1201.524200175" observedRunningTime="2025-10-01 13:13:14.43433769 
+0000 UTC m=+1202.779455186" watchObservedRunningTime="2025-10-01 13:13:14.436983555 +0000 UTC m=+1202.782101041" Oct 01 13:13:14 crc kubenswrapper[4851]: I1001 13:13:14.438320 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"45376e593b7f479231d2d5e58c334337acd5d47c4a95ca6e3b37f5047096d591"} Oct 01 13:13:14 crc kubenswrapper[4851]: I1001 13:13:14.455717 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=49.45570149 podStartE2EDuration="49.45570149s" podCreationTimestamp="2025-10-01 13:12:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:13:14.449607426 +0000 UTC m=+1202.794724932" watchObservedRunningTime="2025-10-01 13:13:14.45570149 +0000 UTC m=+1202.800818976" Oct 01 13:13:15 crc kubenswrapper[4851]: I1001 13:13:15.456347 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fvbs7" event={"ID":"bfc29451-e27b-4bd0-9a5f-6a177e7621be","Type":"ContainerStarted","Data":"b34c5f5f25ad1f532fa333e66fa3f7f64c12594ef6c6d5ee4b481d76a2a5fbbf"} Oct 01 13:13:15 crc kubenswrapper[4851]: I1001 13:13:15.464677 4851 generic.go:334] "Generic (PLEG): container finished" podID="02522c45-cac6-496b-b1dc-83275b924b18" containerID="331c51339c1570f05b723f2d9231105473c006fd4d3ea5e3c7f2d2e9306c1741" exitCode=0 Oct 01 13:13:15 crc kubenswrapper[4851]: I1001 13:13:15.464771 4851 generic.go:334] "Generic (PLEG): container finished" podID="02522c45-cac6-496b-b1dc-83275b924b18" containerID="884f973bd7f32cb81005f66435625de6d609be575f825ef0e23fc17961a52f77" exitCode=143 Oct 01 13:13:15 crc kubenswrapper[4851]: I1001 13:13:15.464730 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"02522c45-cac6-496b-b1dc-83275b924b18","Type":"ContainerDied","Data":"331c51339c1570f05b723f2d9231105473c006fd4d3ea5e3c7f2d2e9306c1741"} Oct 01 13:13:15 crc kubenswrapper[4851]: I1001 13:13:15.464851 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"02522c45-cac6-496b-b1dc-83275b924b18","Type":"ContainerDied","Data":"884f973bd7f32cb81005f66435625de6d609be575f825ef0e23fc17961a52f77"} Oct 01 13:13:15 crc kubenswrapper[4851]: I1001 13:13:15.468220 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"d713b6ba-8532-4b3c-aa32-c97d5547ea62","Type":"ContainerStarted","Data":"2543525a6144b684b2a662ffa5887481abaea2d9e7bcb0361bfdff0ad91e3504"} Oct 01 13:13:15 crc kubenswrapper[4851]: I1001 13:13:15.470158 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67b666754b-b52ns" event={"ID":"1db63449-71cb-4fa2-86db-43a83a914643","Type":"ContainerStarted","Data":"c39866bdf066b1da181c5a3017d9775a7100cc6f4794767d4519b16f3085d3af"} Oct 01 13:13:15 crc kubenswrapper[4851]: I1001 13:13:15.472622 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58c9859d68-bckn5" event={"ID":"2fcf93f8-06db-4cab-8699-9051ca2ae50a","Type":"ContainerStarted","Data":"697fd980f958fbab33e2c44eb69b97ee9de5aa97ed975094ba7b6d1572c196d9"} Oct 01 13:13:15 crc kubenswrapper[4851]: I1001 13:13:15.480129 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
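The pod_startup_latency_tracker entries carry two figures: podStartSLOduration, which excludes image-pull time, and podStartE2EDuration, which includes it; for placement-db-sync-gk28q the gap (3.68 s vs 52.42 s) is almost exactly the pull window bounded by firstStartedPulling and lastFinishedPulling. A sketch (assumed name startup.go, not kubelet code) that extracts both figures from journal text on stdin:

// startup.go (assumed name). Extracts "Observed pod startup duration"
// entries into a small table of SLO vs end-to-end startup durations.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"time"
)

var slo = regexp.MustCompile(`pod="([^"]+)" podStartSLOduration=([\d.]+) podStartE2EDuration="([^"]+)"`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		m := slo.FindStringSubmatch(sc.Text())
		if m == nil {
			continue
		}
		e2e, err := time.ParseDuration(m[3]) // e.g. "52.420134355s"
		if err != nil {
			continue
		}
		fmt.Printf("%-45s slo=%ss e2e=%v\n", m[1], m[2], e2e.Round(time.Millisecond))
	}
}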
pod="openstack/glance-default-internal-api-0" event={"ID":"556db500-a192-4dfa-8a7f-4ab1d1e7e15b","Type":"ContainerStarted","Data":"e64190c7410f428cda3cb53f01f5f24b91b5ecd57305ce3aaaffc33f3c6acb7e"} Oct 01 13:13:15 crc kubenswrapper[4851]: I1001 13:13:15.482828 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5s4gk" event={"ID":"9adae7cd-3c60-49d2-9048-138c3050d6f4","Type":"ContainerStarted","Data":"0ff455f8bbac0fc423335e4fe194bf52dc1d6f9849ca6aee36e83f57cceff6e1"} Oct 01 13:13:15 crc kubenswrapper[4851]: I1001 13:13:15.485561 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-fvbs7" podStartSLOduration=46.485522737 podStartE2EDuration="46.485522737s" podCreationTimestamp="2025-10-01 13:12:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:13:15.474731999 +0000 UTC m=+1203.819849495" watchObservedRunningTime="2025-10-01 13:13:15.485522737 +0000 UTC m=+1203.830640283" Oct 01 13:13:15 crc kubenswrapper[4851]: I1001 13:13:15.505922 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-5s4gk" podStartSLOduration=21.505890898 podStartE2EDuration="21.505890898s" podCreationTimestamp="2025-10-01 13:12:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:13:15.501698528 +0000 UTC m=+1203.846816014" watchObservedRunningTime="2025-10-01 13:13:15.505890898 +0000 UTC m=+1203.851008394" Oct 01 13:13:16 crc kubenswrapper[4851]: I1001 13:13:16.976806 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.123104 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02522c45-cac6-496b-b1dc-83275b924b18-logs\") pod \"02522c45-cac6-496b-b1dc-83275b924b18\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.123180 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02522c45-cac6-496b-b1dc-83275b924b18-combined-ca-bundle\") pod \"02522c45-cac6-496b-b1dc-83275b924b18\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.123256 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02522c45-cac6-496b-b1dc-83275b924b18-config-data\") pod \"02522c45-cac6-496b-b1dc-83275b924b18\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.123309 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gvvc\" (UniqueName: \"kubernetes.io/projected/02522c45-cac6-496b-b1dc-83275b924b18-kube-api-access-4gvvc\") pod \"02522c45-cac6-496b-b1dc-83275b924b18\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.123383 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02522c45-cac6-496b-b1dc-83275b924b18-scripts\") pod \"02522c45-cac6-496b-b1dc-83275b924b18\" (UID: 
\"02522c45-cac6-496b-b1dc-83275b924b18\") " Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.123589 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02522c45-cac6-496b-b1dc-83275b924b18-httpd-run\") pod \"02522c45-cac6-496b-b1dc-83275b924b18\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.123655 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"02522c45-cac6-496b-b1dc-83275b924b18\" (UID: \"02522c45-cac6-496b-b1dc-83275b924b18\") " Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.124677 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02522c45-cac6-496b-b1dc-83275b924b18-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "02522c45-cac6-496b-b1dc-83275b924b18" (UID: "02522c45-cac6-496b-b1dc-83275b924b18"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.124876 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02522c45-cac6-496b-b1dc-83275b924b18-logs" (OuterVolumeSpecName: "logs") pod "02522c45-cac6-496b-b1dc-83275b924b18" (UID: "02522c45-cac6-496b-b1dc-83275b924b18"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.125042 4851 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02522c45-cac6-496b-b1dc-83275b924b18-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.125053 4851 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02522c45-cac6-496b-b1dc-83275b924b18-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.127760 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02522c45-cac6-496b-b1dc-83275b924b18-scripts" (OuterVolumeSpecName: "scripts") pod "02522c45-cac6-496b-b1dc-83275b924b18" (UID: "02522c45-cac6-496b-b1dc-83275b924b18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.130755 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "02522c45-cac6-496b-b1dc-83275b924b18" (UID: "02522c45-cac6-496b-b1dc-83275b924b18"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.130817 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02522c45-cac6-496b-b1dc-83275b924b18-kube-api-access-4gvvc" (OuterVolumeSpecName: "kube-api-access-4gvvc") pod "02522c45-cac6-496b-b1dc-83275b924b18" (UID: "02522c45-cac6-496b-b1dc-83275b924b18"). InnerVolumeSpecName "kube-api-access-4gvvc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.211666 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02522c45-cac6-496b-b1dc-83275b924b18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02522c45-cac6-496b-b1dc-83275b924b18" (UID: "02522c45-cac6-496b-b1dc-83275b924b18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.225619 4851 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.225644 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02522c45-cac6-496b-b1dc-83275b924b18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.225654 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gvvc\" (UniqueName: \"kubernetes.io/projected/02522c45-cac6-496b-b1dc-83275b924b18-kube-api-access-4gvvc\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.225664 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02522c45-cac6-496b-b1dc-83275b924b18-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.294190 4851 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.326986 4851 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.341474 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02522c45-cac6-496b-b1dc-83275b924b18-config-data" (OuterVolumeSpecName: "config-data") pod "02522c45-cac6-496b-b1dc-83275b924b18" (UID: "02522c45-cac6-496b-b1dc-83275b924b18"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.428301 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02522c45-cac6-496b-b1dc-83275b924b18-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.511859 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67b666754b-b52ns" event={"ID":"1db63449-71cb-4fa2-86db-43a83a914643","Type":"ContainerStarted","Data":"1566f50774133ea95c6a0e9c324d2e5bcf12901648eb5a468022250e57e983e9"} Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.516260 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58c9859d68-bckn5" event={"ID":"2fcf93f8-06db-4cab-8699-9051ca2ae50a","Type":"ContainerStarted","Data":"f8db4b5c8004d53026ce762ed28dab9e086f5f0bbbffe487f5dad8c8ef0955b1"} Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.521144 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"cad65438-644c-45b7-8267-370126fe6aef","Type":"ContainerStarted","Data":"c18de202bf9893d951811edfa78db130a0c52b5ddb89a05d9a9bbe0b7653f728"} Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.534128 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55afd90b-7461-4db7-89a6-d45f9bdcb1b3","Type":"ContainerStarted","Data":"6a062c003a7de76b285243f92119681912715e0c2f6b637150600517b03f8579"} Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.538659 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"971ca0ac-6de7-42f1-bf29-5174fd80ced4","Type":"ContainerStarted","Data":"d55abae016177de4dd6c005c3c19e8a7fcaf164791a47b7bbc2c48b5523a7265"} Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.547369 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"02522c45-cac6-496b-b1dc-83275b924b18","Type":"ContainerDied","Data":"30a02d3ef04d8380aa92ea2740377a5edc0ae204e3ffa0aeaddf8e98ca86a664"} Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.547442 4851 scope.go:117] "RemoveContainer" containerID="331c51339c1570f05b723f2d9231105473c006fd4d3ea5e3c7f2d2e9306c1741" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.547642 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.554355 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=20.442577016 podStartE2EDuration="23.554338201s" podCreationTimestamp="2025-10-01 13:12:54 +0000 UTC" firstStartedPulling="2025-10-01 13:13:13.879665067 +0000 UTC m=+1202.224782553" lastFinishedPulling="2025-10-01 13:13:16.991426242 +0000 UTC m=+1205.336543738" observedRunningTime="2025-10-01 13:13:17.55012097 +0000 UTC m=+1205.895238456" watchObservedRunningTime="2025-10-01 13:13:17.554338201 +0000 UTC m=+1205.899455687" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.557787 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-67b666754b-b52ns" podStartSLOduration=42.358025487 podStartE2EDuration="42.557776249s" podCreationTimestamp="2025-10-01 13:12:35 +0000 UTC" firstStartedPulling="2025-10-01 13:13:13.874839319 +0000 UTC m=+1202.219956805" lastFinishedPulling="2025-10-01 13:13:14.074590081 +0000 UTC m=+1202.419707567" observedRunningTime="2025-10-01 13:13:17.530453259 +0000 UTC m=+1205.875570745" watchObservedRunningTime="2025-10-01 13:13:17.557776249 +0000 UTC m=+1205.902893735" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.573237 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"d713b6ba-8532-4b3c-aa32-c97d5547ea62","Type":"ContainerStarted","Data":"e876dc9b41583fe6cd743a4b16f165959cc10fd52c92e16f3cd1a98200d6c5c1"} Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.574531 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.592580 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-58c9859d68-bckn5" podStartSLOduration=42.469052916 podStartE2EDuration="42.592558312s" podCreationTimestamp="2025-10-01 13:12:35 +0000 UTC" firstStartedPulling="2025-10-01 13:13:13.856533716 +0000 UTC m=+1202.201651202" lastFinishedPulling="2025-10-01 13:13:13.980039112 +0000 UTC m=+1202.325156598" observedRunningTime="2025-10-01 13:13:17.569854434 +0000 UTC m=+1205.914971920" watchObservedRunningTime="2025-10-01 13:13:17.592558312 +0000 UTC m=+1205.937675808" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.613514 4851 scope.go:117] "RemoveContainer" containerID="884f973bd7f32cb81005f66435625de6d609be575f825ef0e23fc17961a52f77" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.636018 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=20.496655719 podStartE2EDuration="23.635996802s" podCreationTimestamp="2025-10-01 13:12:54 +0000 UTC" firstStartedPulling="2025-10-01 13:13:13.850847794 +0000 UTC m=+1202.195965280" lastFinishedPulling="2025-10-01 13:13:16.990188877 +0000 UTC m=+1205.335306363" observedRunningTime="2025-10-01 13:13:17.599975604 +0000 UTC m=+1205.945093110" watchObservedRunningTime="2025-10-01 13:13:17.635996802 +0000 UTC m=+1205.981114288" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.636157 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=23.636152266 podStartE2EDuration="23.636152266s" podCreationTimestamp="2025-10-01 13:12:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:13:17.621478797 +0000 UTC m=+1205.966596283" watchObservedRunningTime="2025-10-01 13:13:17.636152266 +0000 UTC m=+1205.981269752" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.665691 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.687509 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.695568 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:13:17 crc kubenswrapper[4851]: E1001 13:13:17.696158 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02522c45-cac6-496b-b1dc-83275b924b18" containerName="glance-log" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.696184 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="02522c45-cac6-496b-b1dc-83275b924b18" containerName="glance-log" Oct 01 13:13:17 crc kubenswrapper[4851]: E1001 13:13:17.696213 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02522c45-cac6-496b-b1dc-83275b924b18" containerName="glance-httpd" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.696224 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="02522c45-cac6-496b-b1dc-83275b924b18" containerName="glance-httpd" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.696519 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="02522c45-cac6-496b-b1dc-83275b924b18" containerName="glance-httpd" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.696540 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="02522c45-cac6-496b-b1dc-83275b924b18" containerName="glance-log" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.697868 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.701687 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.707513 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.708254 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.842755 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.842812 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39c1516d-1f7f-4814-b712-cfa3355ded27-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.842867 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.842918 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-scripts\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.842943 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39c1516d-1f7f-4814-b712-cfa3355ded27-logs\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.843061 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-config-data\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.843095 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.843152 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zmw5w\" (UniqueName: \"kubernetes.io/projected/39c1516d-1f7f-4814-b712-cfa3355ded27-kube-api-access-zmw5w\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.945288 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39c1516d-1f7f-4814-b712-cfa3355ded27-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.945359 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.945426 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-scripts\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.945455 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39c1516d-1f7f-4814-b712-cfa3355ded27-logs\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.945490 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-config-data\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.945538 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.945572 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmw5w\" (UniqueName: \"kubernetes.io/projected/39c1516d-1f7f-4814-b712-cfa3355ded27-kube-api-access-zmw5w\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.945712 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.945760 4851 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.945971 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39c1516d-1f7f-4814-b712-cfa3355ded27-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.946862 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39c1516d-1f7f-4814-b712-cfa3355ded27-logs\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.951784 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-config-data\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.963467 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.965784 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.971041 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-scripts\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.983818 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmw5w\" (UniqueName: \"kubernetes.io/projected/39c1516d-1f7f-4814-b712-cfa3355ded27-kube-api-access-zmw5w\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:17 crc kubenswrapper[4851]: I1001 13:13:17.991926 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " pod="openstack/glance-default-external-api-0" Oct 01 13:13:18 crc kubenswrapper[4851]: I1001 13:13:18.026327 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 13:13:18 crc kubenswrapper[4851]: I1001 13:13:18.343919 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02522c45-cac6-496b-b1dc-83275b924b18" path="/var/lib/kubelet/pods/02522c45-cac6-496b-b1dc-83275b924b18/volumes" Oct 01 13:13:18 crc kubenswrapper[4851]: I1001 13:13:18.590189 4851 generic.go:334] "Generic (PLEG): container finished" podID="9adae7cd-3c60-49d2-9048-138c3050d6f4" containerID="0ff455f8bbac0fc423335e4fe194bf52dc1d6f9849ca6aee36e83f57cceff6e1" exitCode=0 Oct 01 13:13:18 crc kubenswrapper[4851]: I1001 13:13:18.590265 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5s4gk" event={"ID":"9adae7cd-3c60-49d2-9048-138c3050d6f4","Type":"ContainerDied","Data":"0ff455f8bbac0fc423335e4fe194bf52dc1d6f9849ca6aee36e83f57cceff6e1"} Oct 01 13:13:18 crc kubenswrapper[4851]: I1001 13:13:18.603431 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"556db500-a192-4dfa-8a7f-4ab1d1e7e15b","Type":"ContainerStarted","Data":"3c62828291e0d54dc121908d792d7badaec5beb089700b9e32d232ebbc2d7ccd"} Oct 01 13:13:18 crc kubenswrapper[4851]: I1001 13:13:18.652051 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:13:18 crc kubenswrapper[4851]: I1001 13:13:18.662254 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.662238706 podStartE2EDuration="8.662238706s" podCreationTimestamp="2025-10-01 13:13:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:13:18.639468726 +0000 UTC m=+1206.984586212" watchObservedRunningTime="2025-10-01 13:13:18.662238706 +0000 UTC m=+1207.007356192" Oct 01 13:13:19 crc kubenswrapper[4851]: I1001 13:13:19.619489 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"39c1516d-1f7f-4814-b712-cfa3355ded27","Type":"ContainerStarted","Data":"81d8977f210e5feb5373c99f20ced4dbc1d5da428cccf180185431132777899a"} Oct 01 13:13:19 crc kubenswrapper[4851]: I1001 13:13:19.619816 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"39c1516d-1f7f-4814-b712-cfa3355ded27","Type":"ContainerStarted","Data":"87498b7b2cb915c69f1b7f4b83154dc6382ada6e87d20f2ed02fbcc90b02aad6"} Oct 01 13:13:19 crc kubenswrapper[4851]: I1001 13:13:19.619664 4851 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.116371 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5s4gk" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.197957 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bflkj\" (UniqueName: \"kubernetes.io/projected/9adae7cd-3c60-49d2-9048-138c3050d6f4-kube-api-access-bflkj\") pod \"9adae7cd-3c60-49d2-9048-138c3050d6f4\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.198016 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-combined-ca-bundle\") pod \"9adae7cd-3c60-49d2-9048-138c3050d6f4\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.198042 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-config-data\") pod \"9adae7cd-3c60-49d2-9048-138c3050d6f4\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.198186 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-credential-keys\") pod \"9adae7cd-3c60-49d2-9048-138c3050d6f4\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.198233 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-fernet-keys\") pod \"9adae7cd-3c60-49d2-9048-138c3050d6f4\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.198265 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-scripts\") pod \"9adae7cd-3c60-49d2-9048-138c3050d6f4\" (UID: \"9adae7cd-3c60-49d2-9048-138c3050d6f4\") " Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.203746 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9adae7cd-3c60-49d2-9048-138c3050d6f4" (UID: "9adae7cd-3c60-49d2-9048-138c3050d6f4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.219966 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-scripts" (OuterVolumeSpecName: "scripts") pod "9adae7cd-3c60-49d2-9048-138c3050d6f4" (UID: "9adae7cd-3c60-49d2-9048-138c3050d6f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.220145 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9adae7cd-3c60-49d2-9048-138c3050d6f4" (UID: "9adae7cd-3c60-49d2-9048-138c3050d6f4"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.223563 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9adae7cd-3c60-49d2-9048-138c3050d6f4-kube-api-access-bflkj" (OuterVolumeSpecName: "kube-api-access-bflkj") pod "9adae7cd-3c60-49d2-9048-138c3050d6f4" (UID: "9adae7cd-3c60-49d2-9048-138c3050d6f4"). InnerVolumeSpecName "kube-api-access-bflkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.231915 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9adae7cd-3c60-49d2-9048-138c3050d6f4" (UID: "9adae7cd-3c60-49d2-9048-138c3050d6f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.236356 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-config-data" (OuterVolumeSpecName: "config-data") pod "9adae7cd-3c60-49d2-9048-138c3050d6f4" (UID: "9adae7cd-3c60-49d2-9048-138c3050d6f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.301470 4851 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.301620 4851 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.301637 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.301679 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bflkj\" (UniqueName: \"kubernetes.io/projected/9adae7cd-3c60-49d2-9048-138c3050d6f4-kube-api-access-bflkj\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.301702 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.301718 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9adae7cd-3c60-49d2-9048-138c3050d6f4-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.350876 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.352462 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.370358 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.654174 4851 
generic.go:334] "Generic (PLEG): container finished" podID="a1eb277b-5338-4d54-939f-6d636f34d14c" containerID="87f349687f0c6455251e99734637cc08652e820d4dd4e32b98d2c0cb03974969" exitCode=0 Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.654250 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gk28q" event={"ID":"a1eb277b-5338-4d54-939f-6d636f34d14c","Type":"ContainerDied","Data":"87f349687f0c6455251e99734637cc08652e820d4dd4e32b98d2c0cb03974969"} Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.657864 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5s4gk" event={"ID":"9adae7cd-3c60-49d2-9048-138c3050d6f4","Type":"ContainerDied","Data":"03dfd31b0da8d89f48a758ce8d7080cfdbff8b8d9cceb9c284fd2ddc8d2d3984"} Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.657889 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03dfd31b0da8d89f48a758ce8d7080cfdbff8b8d9cceb9c284fd2ddc8d2d3984" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.657906 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5s4gk" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.673484 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"39c1516d-1f7f-4814-b712-cfa3355ded27","Type":"ContainerStarted","Data":"bf2cda50c2b2b074f7d44b4f289e0e262020b11056db7c66dad943b5eed457fb"} Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.702073 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-78d78d4545-tv25n"] Oct 01 13:13:20 crc kubenswrapper[4851]: E1001 13:13:20.702565 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9adae7cd-3c60-49d2-9048-138c3050d6f4" containerName="keystone-bootstrap" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.702583 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="9adae7cd-3c60-49d2-9048-138c3050d6f4" containerName="keystone-bootstrap" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.702792 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="9adae7cd-3c60-49d2-9048-138c3050d6f4" containerName="keystone-bootstrap" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.703462 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.706871 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.70684778 podStartE2EDuration="3.70684778s" podCreationTimestamp="2025-10-01 13:13:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:13:20.698297686 +0000 UTC m=+1209.043415192" watchObservedRunningTime="2025-10-01 13:13:20.70684778 +0000 UTC m=+1209.051965266" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.709346 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.709569 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vxcc4" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.709732 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.709818 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.709899 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.709965 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.735907 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-78d78d4545-tv25n"] Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.817324 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-combined-ca-bundle\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.817446 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-public-tls-certs\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.817565 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-scripts\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.818273 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-credential-keys\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.818291 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-fernet-keys\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.818336 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-config-data\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.818497 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-internal-tls-certs\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.819483 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcs9t\" (UniqueName: \"kubernetes.io/projected/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-kube-api-access-kcs9t\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.929617 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-combined-ca-bundle\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.929690 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-public-tls-certs\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.929724 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-scripts\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.929748 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-credential-keys\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.929771 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-fernet-keys\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.929802 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-config-data\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.929881 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-internal-tls-certs\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.929911 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcs9t\" (UniqueName: \"kubernetes.io/projected/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-kube-api-access-kcs9t\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.934070 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-combined-ca-bundle\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.936767 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-scripts\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.937950 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-fernet-keys\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.940335 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-internal-tls-certs\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.940890 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-public-tls-certs\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.942304 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-config-data\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.947220 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-credential-keys\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " 
pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:20 crc kubenswrapper[4851]: I1001 13:13:20.948197 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcs9t\" (UniqueName: \"kubernetes.io/projected/bdf629fa-5ac6-4985-a03a-c77c19cc9adb-kube-api-access-kcs9t\") pod \"keystone-78d78d4545-tv25n\" (UID: \"bdf629fa-5ac6-4985-a03a-c77c19cc9adb\") " pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:21 crc kubenswrapper[4851]: I1001 13:13:21.023574 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:21 crc kubenswrapper[4851]: I1001 13:13:21.080493 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 13:13:21 crc kubenswrapper[4851]: I1001 13:13:21.081147 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 13:13:21 crc kubenswrapper[4851]: I1001 13:13:21.135300 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 13:13:21 crc kubenswrapper[4851]: I1001 13:13:21.145700 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 13:13:21 crc kubenswrapper[4851]: I1001 13:13:21.684562 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 13:13:21 crc kubenswrapper[4851]: I1001 13:13:21.684751 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 13:13:22 crc kubenswrapper[4851]: I1001 13:13:22.693384 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gk28q" event={"ID":"a1eb277b-5338-4d54-939f-6d636f34d14c","Type":"ContainerDied","Data":"05712b7c6c410acd5bf063c6c7775e8bc4ed3a273cc93206c564ecf7561994a7"} Oct 01 13:13:22 crc kubenswrapper[4851]: I1001 13:13:22.693795 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05712b7c6c410acd5bf063c6c7775e8bc4ed3a273cc93206c564ecf7561994a7" Oct 01 13:13:22 crc kubenswrapper[4851]: I1001 13:13:22.737338 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-gk28q" Oct 01 13:13:22 crc kubenswrapper[4851]: I1001 13:13:22.782851 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1eb277b-5338-4d54-939f-6d636f34d14c-logs\") pod \"a1eb277b-5338-4d54-939f-6d636f34d14c\" (UID: \"a1eb277b-5338-4d54-939f-6d636f34d14c\") " Oct 01 13:13:22 crc kubenswrapper[4851]: I1001 13:13:22.782969 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1eb277b-5338-4d54-939f-6d636f34d14c-combined-ca-bundle\") pod \"a1eb277b-5338-4d54-939f-6d636f34d14c\" (UID: \"a1eb277b-5338-4d54-939f-6d636f34d14c\") " Oct 01 13:13:22 crc kubenswrapper[4851]: I1001 13:13:22.783130 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-625qw\" (UniqueName: \"kubernetes.io/projected/a1eb277b-5338-4d54-939f-6d636f34d14c-kube-api-access-625qw\") pod \"a1eb277b-5338-4d54-939f-6d636f34d14c\" (UID: \"a1eb277b-5338-4d54-939f-6d636f34d14c\") " Oct 01 13:13:22 crc kubenswrapper[4851]: I1001 13:13:22.783153 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1eb277b-5338-4d54-939f-6d636f34d14c-config-data\") pod \"a1eb277b-5338-4d54-939f-6d636f34d14c\" (UID: \"a1eb277b-5338-4d54-939f-6d636f34d14c\") " Oct 01 13:13:22 crc kubenswrapper[4851]: I1001 13:13:22.783173 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1eb277b-5338-4d54-939f-6d636f34d14c-scripts\") pod \"a1eb277b-5338-4d54-939f-6d636f34d14c\" (UID: \"a1eb277b-5338-4d54-939f-6d636f34d14c\") " Oct 01 13:13:22 crc kubenswrapper[4851]: I1001 13:13:22.784736 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1eb277b-5338-4d54-939f-6d636f34d14c-logs" (OuterVolumeSpecName: "logs") pod "a1eb277b-5338-4d54-939f-6d636f34d14c" (UID: "a1eb277b-5338-4d54-939f-6d636f34d14c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:13:22 crc kubenswrapper[4851]: I1001 13:13:22.804520 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1eb277b-5338-4d54-939f-6d636f34d14c-kube-api-access-625qw" (OuterVolumeSpecName: "kube-api-access-625qw") pod "a1eb277b-5338-4d54-939f-6d636f34d14c" (UID: "a1eb277b-5338-4d54-939f-6d636f34d14c"). InnerVolumeSpecName "kube-api-access-625qw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:22 crc kubenswrapper[4851]: I1001 13:13:22.804544 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1eb277b-5338-4d54-939f-6d636f34d14c-scripts" (OuterVolumeSpecName: "scripts") pod "a1eb277b-5338-4d54-939f-6d636f34d14c" (UID: "a1eb277b-5338-4d54-939f-6d636f34d14c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:22 crc kubenswrapper[4851]: I1001 13:13:22.821567 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1eb277b-5338-4d54-939f-6d636f34d14c-config-data" (OuterVolumeSpecName: "config-data") pod "a1eb277b-5338-4d54-939f-6d636f34d14c" (UID: "a1eb277b-5338-4d54-939f-6d636f34d14c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:22 crc kubenswrapper[4851]: I1001 13:13:22.835443 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1eb277b-5338-4d54-939f-6d636f34d14c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1eb277b-5338-4d54-939f-6d636f34d14c" (UID: "a1eb277b-5338-4d54-939f-6d636f34d14c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:22 crc kubenswrapper[4851]: I1001 13:13:22.885231 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1eb277b-5338-4d54-939f-6d636f34d14c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:22 crc kubenswrapper[4851]: I1001 13:13:22.885274 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-625qw\" (UniqueName: \"kubernetes.io/projected/a1eb277b-5338-4d54-939f-6d636f34d14c-kube-api-access-625qw\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:22 crc kubenswrapper[4851]: I1001 13:13:22.885291 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1eb277b-5338-4d54-939f-6d636f34d14c-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:22 crc kubenswrapper[4851]: I1001 13:13:22.885303 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1eb277b-5338-4d54-939f-6d636f34d14c-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:22 crc kubenswrapper[4851]: I1001 13:13:22.885316 4851 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1eb277b-5338-4d54-939f-6d636f34d14c-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.682148 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.684141 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.707409 4851 generic.go:334] "Generic (PLEG): container finished" podID="d64a19f8-04aa-4c82-9818-6a50d9e3d62b" containerID="8ba017052fe7382141520764bcb6eefb2e1e996457cd67073bd2717c8ca8b391" exitCode=0 Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.708776 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-l9xpp" event={"ID":"d64a19f8-04aa-4c82-9818-6a50d9e3d62b","Type":"ContainerDied","Data":"8ba017052fe7382141520764bcb6eefb2e1e996457cd67073bd2717c8ca8b391"} Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.714484 4851 generic.go:334] "Generic (PLEG): container finished" podID="971ca0ac-6de7-42f1-bf29-5174fd80ced4" containerID="d55abae016177de4dd6c005c3c19e8a7fcaf164791a47b7bbc2c48b5523a7265" exitCode=1 Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.714576 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-gk28q" Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.715796 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"971ca0ac-6de7-42f1-bf29-5174fd80ced4","Type":"ContainerDied","Data":"d55abae016177de4dd6c005c3c19e8a7fcaf164791a47b7bbc2c48b5523a7265"} Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.716608 4851 scope.go:117] "RemoveContainer" containerID="d55abae016177de4dd6c005c3c19e8a7fcaf164791a47b7bbc2c48b5523a7265" Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.860969 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7c6fb58db4-9tzr2"] Oct 01 13:13:23 crc kubenswrapper[4851]: E1001 13:13:23.861715 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1eb277b-5338-4d54-939f-6d636f34d14c" containerName="placement-db-sync" Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.861733 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1eb277b-5338-4d54-939f-6d636f34d14c" containerName="placement-db-sync" Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.861978 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1eb277b-5338-4d54-939f-6d636f34d14c" containerName="placement-db-sync" Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.863224 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.865172 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.865308 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.865385 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.865549 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qz7dn" Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.865750 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.885315 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7c6fb58db4-9tzr2"] Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.901716 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2845afcf-be46-4aad-a15a-79c3a54b844c-combined-ca-bundle\") pod \"placement-7c6fb58db4-9tzr2\" (UID: \"2845afcf-be46-4aad-a15a-79c3a54b844c\") " pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.901769 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2845afcf-be46-4aad-a15a-79c3a54b844c-scripts\") pod \"placement-7c6fb58db4-9tzr2\" (UID: \"2845afcf-be46-4aad-a15a-79c3a54b844c\") " pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.901825 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2845afcf-be46-4aad-a15a-79c3a54b844c-logs\") pod \"placement-7c6fb58db4-9tzr2\" (UID: \"2845afcf-be46-4aad-a15a-79c3a54b844c\") " pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.901867 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2845afcf-be46-4aad-a15a-79c3a54b844c-config-data\") pod \"placement-7c6fb58db4-9tzr2\" (UID: \"2845afcf-be46-4aad-a15a-79c3a54b844c\") " pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.901912 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2845afcf-be46-4aad-a15a-79c3a54b844c-internal-tls-certs\") pod \"placement-7c6fb58db4-9tzr2\" (UID: \"2845afcf-be46-4aad-a15a-79c3a54b844c\") " pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.901941 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndcv9\" (UniqueName: \"kubernetes.io/projected/2845afcf-be46-4aad-a15a-79c3a54b844c-kube-api-access-ndcv9\") pod \"placement-7c6fb58db4-9tzr2\" (UID: \"2845afcf-be46-4aad-a15a-79c3a54b844c\") " pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:23 crc kubenswrapper[4851]: I1001 13:13:23.901979 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2845afcf-be46-4aad-a15a-79c3a54b844c-public-tls-certs\") pod \"placement-7c6fb58db4-9tzr2\" (UID: \"2845afcf-be46-4aad-a15a-79c3a54b844c\") " pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:24 crc kubenswrapper[4851]: I1001 13:13:24.005414 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2845afcf-be46-4aad-a15a-79c3a54b844c-logs\") pod \"placement-7c6fb58db4-9tzr2\" (UID: \"2845afcf-be46-4aad-a15a-79c3a54b844c\") " pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:24 crc kubenswrapper[4851]: I1001 13:13:24.005517 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2845afcf-be46-4aad-a15a-79c3a54b844c-config-data\") pod \"placement-7c6fb58db4-9tzr2\" (UID: \"2845afcf-be46-4aad-a15a-79c3a54b844c\") " pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:24 crc kubenswrapper[4851]: I1001 13:13:24.005580 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2845afcf-be46-4aad-a15a-79c3a54b844c-internal-tls-certs\") pod \"placement-7c6fb58db4-9tzr2\" (UID: \"2845afcf-be46-4aad-a15a-79c3a54b844c\") " pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:24 crc kubenswrapper[4851]: I1001 13:13:24.005618 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndcv9\" (UniqueName: \"kubernetes.io/projected/2845afcf-be46-4aad-a15a-79c3a54b844c-kube-api-access-ndcv9\") pod \"placement-7c6fb58db4-9tzr2\" (UID: \"2845afcf-be46-4aad-a15a-79c3a54b844c\") " pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:24 crc kubenswrapper[4851]: I1001 13:13:24.005665 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2845afcf-be46-4aad-a15a-79c3a54b844c-public-tls-certs\") pod \"placement-7c6fb58db4-9tzr2\" (UID: \"2845afcf-be46-4aad-a15a-79c3a54b844c\") " pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:24 crc kubenswrapper[4851]: I1001 13:13:24.005702 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2845afcf-be46-4aad-a15a-79c3a54b844c-combined-ca-bundle\") pod \"placement-7c6fb58db4-9tzr2\" (UID: \"2845afcf-be46-4aad-a15a-79c3a54b844c\") " pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:24 crc kubenswrapper[4851]: I1001 13:13:24.005737 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2845afcf-be46-4aad-a15a-79c3a54b844c-scripts\") pod \"placement-7c6fb58db4-9tzr2\" (UID: \"2845afcf-be46-4aad-a15a-79c3a54b844c\") " pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:24 crc kubenswrapper[4851]: I1001 13:13:24.005845 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2845afcf-be46-4aad-a15a-79c3a54b844c-logs\") pod \"placement-7c6fb58db4-9tzr2\" (UID: \"2845afcf-be46-4aad-a15a-79c3a54b844c\") " pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:24 crc kubenswrapper[4851]: I1001 13:13:24.011668 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2845afcf-be46-4aad-a15a-79c3a54b844c-internal-tls-certs\") pod \"placement-7c6fb58db4-9tzr2\" (UID: \"2845afcf-be46-4aad-a15a-79c3a54b844c\") " pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:24 crc kubenswrapper[4851]: I1001 13:13:24.013637 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2845afcf-be46-4aad-a15a-79c3a54b844c-public-tls-certs\") pod \"placement-7c6fb58db4-9tzr2\" (UID: \"2845afcf-be46-4aad-a15a-79c3a54b844c\") " pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:24 crc kubenswrapper[4851]: I1001 13:13:24.013835 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2845afcf-be46-4aad-a15a-79c3a54b844c-combined-ca-bundle\") pod \"placement-7c6fb58db4-9tzr2\" (UID: \"2845afcf-be46-4aad-a15a-79c3a54b844c\") " pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:24 crc kubenswrapper[4851]: I1001 13:13:24.024440 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2845afcf-be46-4aad-a15a-79c3a54b844c-scripts\") pod \"placement-7c6fb58db4-9tzr2\" (UID: \"2845afcf-be46-4aad-a15a-79c3a54b844c\") " pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:24 crc kubenswrapper[4851]: I1001 13:13:24.024708 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndcv9\" (UniqueName: \"kubernetes.io/projected/2845afcf-be46-4aad-a15a-79c3a54b844c-kube-api-access-ndcv9\") pod \"placement-7c6fb58db4-9tzr2\" (UID: \"2845afcf-be46-4aad-a15a-79c3a54b844c\") " pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:24 crc kubenswrapper[4851]: I1001 13:13:24.029132 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2845afcf-be46-4aad-a15a-79c3a54b844c-config-data\") pod \"placement-7c6fb58db4-9tzr2\" (UID: \"2845afcf-be46-4aad-a15a-79c3a54b844c\") " 
pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:24 crc kubenswrapper[4851]: I1001 13:13:24.200740 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.083261 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-l9xpp" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.137250 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d64a19f8-04aa-4c82-9818-6a50d9e3d62b-db-sync-config-data\") pod \"d64a19f8-04aa-4c82-9818-6a50d9e3d62b\" (UID: \"d64a19f8-04aa-4c82-9818-6a50d9e3d62b\") " Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.137297 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d64a19f8-04aa-4c82-9818-6a50d9e3d62b-combined-ca-bundle\") pod \"d64a19f8-04aa-4c82-9818-6a50d9e3d62b\" (UID: \"d64a19f8-04aa-4c82-9818-6a50d9e3d62b\") " Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.137396 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz85j\" (UniqueName: \"kubernetes.io/projected/d64a19f8-04aa-4c82-9818-6a50d9e3d62b-kube-api-access-nz85j\") pod \"d64a19f8-04aa-4c82-9818-6a50d9e3d62b\" (UID: \"d64a19f8-04aa-4c82-9818-6a50d9e3d62b\") " Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.147652 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d64a19f8-04aa-4c82-9818-6a50d9e3d62b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d64a19f8-04aa-4c82-9818-6a50d9e3d62b" (UID: "d64a19f8-04aa-4c82-9818-6a50d9e3d62b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.147653 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d64a19f8-04aa-4c82-9818-6a50d9e3d62b-kube-api-access-nz85j" (OuterVolumeSpecName: "kube-api-access-nz85j") pod "d64a19f8-04aa-4c82-9818-6a50d9e3d62b" (UID: "d64a19f8-04aa-4c82-9818-6a50d9e3d62b"). InnerVolumeSpecName "kube-api-access-nz85j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.179680 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d64a19f8-04aa-4c82-9818-6a50d9e3d62b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d64a19f8-04aa-4c82-9818-6a50d9e3d62b" (UID: "d64a19f8-04aa-4c82-9818-6a50d9e3d62b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.239146 4851 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d64a19f8-04aa-4c82-9818-6a50d9e3d62b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.239188 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d64a19f8-04aa-4c82-9818-6a50d9e3d62b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.239200 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz85j\" (UniqueName: \"kubernetes.io/projected/d64a19f8-04aa-4c82-9818-6a50d9e3d62b-kube-api-access-nz85j\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.247593 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.247653 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.247672 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.247683 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.328825 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7c6fb58db4-9tzr2"] Oct 01 13:13:25 crc kubenswrapper[4851]: W1001 13:13:25.335007 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2845afcf_be46_4aad_a15a_79c3a54b844c.slice/crio-3e91d713fb04bc2c7e3afb5f87127eac9ef7738733389e44e8dcefae4eb13a8c WatchSource:0}: Error finding container 3e91d713fb04bc2c7e3afb5f87127eac9ef7738733389e44e8dcefae4eb13a8c: Status 404 returned error can't find the container with id 3e91d713fb04bc2c7e3afb5f87127eac9ef7738733389e44e8dcefae4eb13a8c Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.350954 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.356425 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.373351 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.410882 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-78d78d4545-tv25n"] Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.413737 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.735920 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7c6fb58db4-9tzr2" event={"ID":"2845afcf-be46-4aad-a15a-79c3a54b844c","Type":"ContainerStarted","Data":"9ce05a1cc316c68b30719e99f9bcab3ee920fcd8ab80714bf630c16b1befe96b"} Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.736207 4851 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/placement-7c6fb58db4-9tzr2" event={"ID":"2845afcf-be46-4aad-a15a-79c3a54b844c","Type":"ContainerStarted","Data":"3e91d713fb04bc2c7e3afb5f87127eac9ef7738733389e44e8dcefae4eb13a8c"} Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.736251 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.736292 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.739208 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55afd90b-7461-4db7-89a6-d45f9bdcb1b3","Type":"ContainerStarted","Data":"5c73849f60fb43fb34c8f621f09a4a436c73dd24d7472e776293c138336b62ea"} Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.741421 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-l9xpp" event={"ID":"d64a19f8-04aa-4c82-9818-6a50d9e3d62b","Type":"ContainerDied","Data":"e4a120bcaef0c513819c5fcfbecbb7aa4b5227783ecfeffa0bf37bcbabd17c49"} Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.741447 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4a120bcaef0c513819c5fcfbecbb7aa4b5227783ecfeffa0bf37bcbabd17c49" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.741491 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-l9xpp" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.743231 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"971ca0ac-6de7-42f1-bf29-5174fd80ced4","Type":"ContainerStarted","Data":"e24b7de9a6b59f816add29e99ed4c97b0171d8f00ee2b1591eb4620a52cff2dd"} Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.751162 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-78d78d4545-tv25n" event={"ID":"bdf629fa-5ac6-4985-a03a-c77c19cc9adb","Type":"ContainerStarted","Data":"295484afd0b739f94b5d2db85dc94f8568c8c56fca19a3729989ea68622cd2a5"} Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.751223 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.751237 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-78d78d4545-tv25n" event={"ID":"bdf629fa-5ac6-4985-a03a-c77c19cc9adb","Type":"ContainerStarted","Data":"a2eded1036751e3b77f1a7b7df0e4e53b1260a7df5614fc683d3c1fa052021b9"} Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.761938 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.767676 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7c6fb58db4-9tzr2" podStartSLOduration=2.767656402 podStartE2EDuration="2.767656402s" podCreationTimestamp="2025-10-01 13:13:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:13:25.756137903 +0000 UTC m=+1214.101255419" watchObservedRunningTime="2025-10-01 13:13:25.767656402 +0000 UTC m=+1214.112773888" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.791705 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/watcher-applier-0" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.799743 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-78d78d4545-tv25n" podStartSLOduration=5.799706527 podStartE2EDuration="5.799706527s" podCreationTimestamp="2025-10-01 13:13:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:13:25.784123712 +0000 UTC m=+1214.129241198" watchObservedRunningTime="2025-10-01 13:13:25.799706527 +0000 UTC m=+1214.144824013" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.865605 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.865692 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.965783 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:13:25 crc kubenswrapper[4851]: I1001 13:13:25.966142 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.088391 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7f7d86b4d6-9685z"] Oct 01 13:13:26 crc kubenswrapper[4851]: E1001 13:13:26.088847 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d64a19f8-04aa-4c82-9818-6a50d9e3d62b" containerName="barbican-db-sync" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.088875 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="d64a19f8-04aa-4c82-9818-6a50d9e3d62b" containerName="barbican-db-sync" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.089074 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="d64a19f8-04aa-4c82-9818-6a50d9e3d62b" containerName="barbican-db-sync" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.090019 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7f7d86b4d6-9685z" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.094509 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.104901 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-dpls5" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.105118 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.114609 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7f7d86b4d6-9685z"] Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.130628 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6c49b89697-kzlf9"] Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.132011 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6c49b89697-kzlf9" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.143167 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.144309 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6c49b89697-kzlf9"] Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.157317 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-956c876c5-cvlqz"] Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.163310 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.172520 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4fb21634-7084-49a9-88d7-8759e5c794cb-config-data-custom\") pod \"barbican-worker-6c49b89697-kzlf9\" (UID: \"4fb21634-7084-49a9-88d7-8759e5c794cb\") " pod="openstack/barbican-worker-6c49b89697-kzlf9" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.172561 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83d01de2-5036-447d-9bcc-adae38fc5202-config-data-custom\") pod \"barbican-keystone-listener-7f7d86b4d6-9685z\" (UID: \"83d01de2-5036-447d-9bcc-adae38fc5202\") " pod="openstack/barbican-keystone-listener-7f7d86b4d6-9685z" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.173731 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83d01de2-5036-447d-9bcc-adae38fc5202-logs\") pod \"barbican-keystone-listener-7f7d86b4d6-9685z\" (UID: \"83d01de2-5036-447d-9bcc-adae38fc5202\") " pod="openstack/barbican-keystone-listener-7f7d86b4d6-9685z" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.173836 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb21634-7084-49a9-88d7-8759e5c794cb-combined-ca-bundle\") pod \"barbican-worker-6c49b89697-kzlf9\" (UID: \"4fb21634-7084-49a9-88d7-8759e5c794cb\") " pod="openstack/barbican-worker-6c49b89697-kzlf9" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.173912 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6642j\" (UniqueName: \"kubernetes.io/projected/83d01de2-5036-447d-9bcc-adae38fc5202-kube-api-access-6642j\") pod \"barbican-keystone-listener-7f7d86b4d6-9685z\" (UID: \"83d01de2-5036-447d-9bcc-adae38fc5202\") " pod="openstack/barbican-keystone-listener-7f7d86b4d6-9685z" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.173934 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d01de2-5036-447d-9bcc-adae38fc5202-combined-ca-bundle\") pod \"barbican-keystone-listener-7f7d86b4d6-9685z\" (UID: \"83d01de2-5036-447d-9bcc-adae38fc5202\") " pod="openstack/barbican-keystone-listener-7f7d86b4d6-9685z" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.174023 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt4ml\" 
(UniqueName: \"kubernetes.io/projected/4fb21634-7084-49a9-88d7-8759e5c794cb-kube-api-access-wt4ml\") pod \"barbican-worker-6c49b89697-kzlf9\" (UID: \"4fb21634-7084-49a9-88d7-8759e5c794cb\") " pod="openstack/barbican-worker-6c49b89697-kzlf9" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.174048 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fb21634-7084-49a9-88d7-8759e5c794cb-logs\") pod \"barbican-worker-6c49b89697-kzlf9\" (UID: \"4fb21634-7084-49a9-88d7-8759e5c794cb\") " pod="openstack/barbican-worker-6c49b89697-kzlf9" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.174076 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83d01de2-5036-447d-9bcc-adae38fc5202-config-data\") pod \"barbican-keystone-listener-7f7d86b4d6-9685z\" (UID: \"83d01de2-5036-447d-9bcc-adae38fc5202\") " pod="openstack/barbican-keystone-listener-7f7d86b4d6-9685z" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.174110 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb21634-7084-49a9-88d7-8759e5c794cb-config-data\") pod \"barbican-worker-6c49b89697-kzlf9\" (UID: \"4fb21634-7084-49a9-88d7-8759e5c794cb\") " pod="openstack/barbican-worker-6c49b89697-kzlf9" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.174203 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-956c876c5-cvlqz"] Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.275565 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb21634-7084-49a9-88d7-8759e5c794cb-config-data\") pod \"barbican-worker-6c49b89697-kzlf9\" (UID: \"4fb21634-7084-49a9-88d7-8759e5c794cb\") " pod="openstack/barbican-worker-6c49b89697-kzlf9" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.275618 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-config\") pod \"dnsmasq-dns-956c876c5-cvlqz\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.275640 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-dns-swift-storage-0\") pod \"dnsmasq-dns-956c876c5-cvlqz\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.275670 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4fb21634-7084-49a9-88d7-8759e5c794cb-config-data-custom\") pod \"barbican-worker-6c49b89697-kzlf9\" (UID: \"4fb21634-7084-49a9-88d7-8759e5c794cb\") " pod="openstack/barbican-worker-6c49b89697-kzlf9" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.275695 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83d01de2-5036-447d-9bcc-adae38fc5202-config-data-custom\") pod \"barbican-keystone-listener-7f7d86b4d6-9685z\" (UID: 
\"83d01de2-5036-447d-9bcc-adae38fc5202\") " pod="openstack/barbican-keystone-listener-7f7d86b4d6-9685z" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.275715 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83d01de2-5036-447d-9bcc-adae38fc5202-logs\") pod \"barbican-keystone-listener-7f7d86b4d6-9685z\" (UID: \"83d01de2-5036-447d-9bcc-adae38fc5202\") " pod="openstack/barbican-keystone-listener-7f7d86b4d6-9685z" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.275781 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf6pf\" (UniqueName: \"kubernetes.io/projected/97ea855a-7426-4e7d-af3d-c7d498622629-kube-api-access-qf6pf\") pod \"dnsmasq-dns-956c876c5-cvlqz\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.275799 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb21634-7084-49a9-88d7-8759e5c794cb-combined-ca-bundle\") pod \"barbican-worker-6c49b89697-kzlf9\" (UID: \"4fb21634-7084-49a9-88d7-8759e5c794cb\") " pod="openstack/barbican-worker-6c49b89697-kzlf9" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.275841 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6642j\" (UniqueName: \"kubernetes.io/projected/83d01de2-5036-447d-9bcc-adae38fc5202-kube-api-access-6642j\") pod \"barbican-keystone-listener-7f7d86b4d6-9685z\" (UID: \"83d01de2-5036-447d-9bcc-adae38fc5202\") " pod="openstack/barbican-keystone-listener-7f7d86b4d6-9685z" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.275860 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-ovsdbserver-sb\") pod \"dnsmasq-dns-956c876c5-cvlqz\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.275876 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d01de2-5036-447d-9bcc-adae38fc5202-combined-ca-bundle\") pod \"barbican-keystone-listener-7f7d86b4d6-9685z\" (UID: \"83d01de2-5036-447d-9bcc-adae38fc5202\") " pod="openstack/barbican-keystone-listener-7f7d86b4d6-9685z" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.275918 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-dns-svc\") pod \"dnsmasq-dns-956c876c5-cvlqz\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.275948 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt4ml\" (UniqueName: \"kubernetes.io/projected/4fb21634-7084-49a9-88d7-8759e5c794cb-kube-api-access-wt4ml\") pod \"barbican-worker-6c49b89697-kzlf9\" (UID: \"4fb21634-7084-49a9-88d7-8759e5c794cb\") " pod="openstack/barbican-worker-6c49b89697-kzlf9" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.275970 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/4fb21634-7084-49a9-88d7-8759e5c794cb-logs\") pod \"barbican-worker-6c49b89697-kzlf9\" (UID: \"4fb21634-7084-49a9-88d7-8759e5c794cb\") " pod="openstack/barbican-worker-6c49b89697-kzlf9" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.275991 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-ovsdbserver-nb\") pod \"dnsmasq-dns-956c876c5-cvlqz\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.276035 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83d01de2-5036-447d-9bcc-adae38fc5202-config-data\") pod \"barbican-keystone-listener-7f7d86b4d6-9685z\" (UID: \"83d01de2-5036-447d-9bcc-adae38fc5202\") " pod="openstack/barbican-keystone-listener-7f7d86b4d6-9685z" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.276972 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83d01de2-5036-447d-9bcc-adae38fc5202-logs\") pod \"barbican-keystone-listener-7f7d86b4d6-9685z\" (UID: \"83d01de2-5036-447d-9bcc-adae38fc5202\") " pod="openstack/barbican-keystone-listener-7f7d86b4d6-9685z" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.277774 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fb21634-7084-49a9-88d7-8759e5c794cb-logs\") pod \"barbican-worker-6c49b89697-kzlf9\" (UID: \"4fb21634-7084-49a9-88d7-8759e5c794cb\") " pod="openstack/barbican-worker-6c49b89697-kzlf9" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.281720 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4fb21634-7084-49a9-88d7-8759e5c794cb-config-data-custom\") pod \"barbican-worker-6c49b89697-kzlf9\" (UID: \"4fb21634-7084-49a9-88d7-8759e5c794cb\") " pod="openstack/barbican-worker-6c49b89697-kzlf9" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.283842 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83d01de2-5036-447d-9bcc-adae38fc5202-config-data-custom\") pod \"barbican-keystone-listener-7f7d86b4d6-9685z\" (UID: \"83d01de2-5036-447d-9bcc-adae38fc5202\") " pod="openstack/barbican-keystone-listener-7f7d86b4d6-9685z" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.287076 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83d01de2-5036-447d-9bcc-adae38fc5202-config-data\") pod \"barbican-keystone-listener-7f7d86b4d6-9685z\" (UID: \"83d01de2-5036-447d-9bcc-adae38fc5202\") " pod="openstack/barbican-keystone-listener-7f7d86b4d6-9685z" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.295470 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d01de2-5036-447d-9bcc-adae38fc5202-combined-ca-bundle\") pod \"barbican-keystone-listener-7f7d86b4d6-9685z\" (UID: \"83d01de2-5036-447d-9bcc-adae38fc5202\") " pod="openstack/barbican-keystone-listener-7f7d86b4d6-9685z" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.296556 4851 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb21634-7084-49a9-88d7-8759e5c794cb-config-data\") pod \"barbican-worker-6c49b89697-kzlf9\" (UID: \"4fb21634-7084-49a9-88d7-8759e5c794cb\") " pod="openstack/barbican-worker-6c49b89697-kzlf9" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.310611 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb21634-7084-49a9-88d7-8759e5c794cb-combined-ca-bundle\") pod \"barbican-worker-6c49b89697-kzlf9\" (UID: \"4fb21634-7084-49a9-88d7-8759e5c794cb\") " pod="openstack/barbican-worker-6c49b89697-kzlf9" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.317055 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6642j\" (UniqueName: \"kubernetes.io/projected/83d01de2-5036-447d-9bcc-adae38fc5202-kube-api-access-6642j\") pod \"barbican-keystone-listener-7f7d86b4d6-9685z\" (UID: \"83d01de2-5036-447d-9bcc-adae38fc5202\") " pod="openstack/barbican-keystone-listener-7f7d86b4d6-9685z" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.317730 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt4ml\" (UniqueName: \"kubernetes.io/projected/4fb21634-7084-49a9-88d7-8759e5c794cb-kube-api-access-wt4ml\") pod \"barbican-worker-6c49b89697-kzlf9\" (UID: \"4fb21634-7084-49a9-88d7-8759e5c794cb\") " pod="openstack/barbican-worker-6c49b89697-kzlf9" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.360889 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-f9dc8947d-h5tw8"] Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.362263 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.364538 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.370909 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-f9dc8947d-h5tw8"] Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.381939 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-config\") pod \"dnsmasq-dns-956c876c5-cvlqz\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.381985 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-dns-swift-storage-0\") pod \"dnsmasq-dns-956c876c5-cvlqz\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.382044 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf6pf\" (UniqueName: \"kubernetes.io/projected/97ea855a-7426-4e7d-af3d-c7d498622629-kube-api-access-qf6pf\") pod \"dnsmasq-dns-956c876c5-cvlqz\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.382093 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-ovsdbserver-sb\") pod \"dnsmasq-dns-956c876c5-cvlqz\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.382134 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-dns-svc\") pod \"dnsmasq-dns-956c876c5-cvlqz\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.382172 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-ovsdbserver-nb\") pod \"dnsmasq-dns-956c876c5-cvlqz\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.385474 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-ovsdbserver-nb\") pod \"dnsmasq-dns-956c876c5-cvlqz\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.385511 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-config\") pod \"dnsmasq-dns-956c876c5-cvlqz\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.385907 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-dns-svc\") pod \"dnsmasq-dns-956c876c5-cvlqz\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.388886 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-ovsdbserver-sb\") pod \"dnsmasq-dns-956c876c5-cvlqz\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.388899 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-dns-swift-storage-0\") pod \"dnsmasq-dns-956c876c5-cvlqz\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.417911 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf6pf\" (UniqueName: \"kubernetes.io/projected/97ea855a-7426-4e7d-af3d-c7d498622629-kube-api-access-qf6pf\") pod \"dnsmasq-dns-956c876c5-cvlqz\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.467896 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7f7d86b4d6-9685z" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.483807 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f40427b-938d-4f47-a126-b05108b109a6-logs\") pod \"barbican-api-f9dc8947d-h5tw8\" (UID: \"6f40427b-938d-4f47-a126-b05108b109a6\") " pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.483951 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg8q7\" (UniqueName: \"kubernetes.io/projected/6f40427b-938d-4f47-a126-b05108b109a6-kube-api-access-kg8q7\") pod \"barbican-api-f9dc8947d-h5tw8\" (UID: \"6f40427b-938d-4f47-a126-b05108b109a6\") " pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.484003 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f40427b-938d-4f47-a126-b05108b109a6-config-data-custom\") pod \"barbican-api-f9dc8947d-h5tw8\" (UID: \"6f40427b-938d-4f47-a126-b05108b109a6\") " pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.484079 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f40427b-938d-4f47-a126-b05108b109a6-combined-ca-bundle\") pod \"barbican-api-f9dc8947d-h5tw8\" (UID: \"6f40427b-938d-4f47-a126-b05108b109a6\") " pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.484134 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f40427b-938d-4f47-a126-b05108b109a6-config-data\") pod \"barbican-api-f9dc8947d-h5tw8\" (UID: \"6f40427b-938d-4f47-a126-b05108b109a6\") " pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.487526 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6c49b89697-kzlf9" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.517022 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.585870 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg8q7\" (UniqueName: \"kubernetes.io/projected/6f40427b-938d-4f47-a126-b05108b109a6-kube-api-access-kg8q7\") pod \"barbican-api-f9dc8947d-h5tw8\" (UID: \"6f40427b-938d-4f47-a126-b05108b109a6\") " pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.585931 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f40427b-938d-4f47-a126-b05108b109a6-config-data-custom\") pod \"barbican-api-f9dc8947d-h5tw8\" (UID: \"6f40427b-938d-4f47-a126-b05108b109a6\") " pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.586005 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f40427b-938d-4f47-a126-b05108b109a6-combined-ca-bundle\") pod \"barbican-api-f9dc8947d-h5tw8\" (UID: \"6f40427b-938d-4f47-a126-b05108b109a6\") " pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.586057 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f40427b-938d-4f47-a126-b05108b109a6-config-data\") pod \"barbican-api-f9dc8947d-h5tw8\" (UID: \"6f40427b-938d-4f47-a126-b05108b109a6\") " pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.586108 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f40427b-938d-4f47-a126-b05108b109a6-logs\") pod \"barbican-api-f9dc8947d-h5tw8\" (UID: \"6f40427b-938d-4f47-a126-b05108b109a6\") " pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.586752 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f40427b-938d-4f47-a126-b05108b109a6-logs\") pod \"barbican-api-f9dc8947d-h5tw8\" (UID: \"6f40427b-938d-4f47-a126-b05108b109a6\") " pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.590292 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f40427b-938d-4f47-a126-b05108b109a6-combined-ca-bundle\") pod \"barbican-api-f9dc8947d-h5tw8\" (UID: \"6f40427b-938d-4f47-a126-b05108b109a6\") " pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.603985 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f40427b-938d-4f47-a126-b05108b109a6-config-data-custom\") pod \"barbican-api-f9dc8947d-h5tw8\" (UID: \"6f40427b-938d-4f47-a126-b05108b109a6\") " pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.604440 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f40427b-938d-4f47-a126-b05108b109a6-config-data\") pod \"barbican-api-f9dc8947d-h5tw8\" (UID: \"6f40427b-938d-4f47-a126-b05108b109a6\") " pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:26 crc kubenswrapper[4851]: 
I1001 13:13:26.607423 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg8q7\" (UniqueName: \"kubernetes.io/projected/6f40427b-938d-4f47-a126-b05108b109a6-kube-api-access-kg8q7\") pod \"barbican-api-f9dc8947d-h5tw8\" (UID: \"6f40427b-938d-4f47-a126-b05108b109a6\") " pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.713155 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:26 crc kubenswrapper[4851]: I1001 13:13:26.760915 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7c6fb58db4-9tzr2" event={"ID":"2845afcf-be46-4aad-a15a-79c3a54b844c","Type":"ContainerStarted","Data":"7afefccf2910d214ad5a54e66a08eb6c1126bc6d34634d8cb0d8bb1b1e5183d9"} Oct 01 13:13:27 crc kubenswrapper[4851]: I1001 13:13:27.978866 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7f7d86b4d6-9685z"] Oct 01 13:13:28 crc kubenswrapper[4851]: W1001 13:13:28.014966 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83d01de2_5036_447d_9bcc_adae38fc5202.slice/crio-30916b7bc2994f3895552b0fa5ccb849a8f5a904d0ae92137a9815296509306d WatchSource:0}: Error finding container 30916b7bc2994f3895552b0fa5ccb849a8f5a904d0ae92137a9815296509306d: Status 404 returned error can't find the container with id 30916b7bc2994f3895552b0fa5ccb849a8f5a904d0ae92137a9815296509306d Oct 01 13:13:28 crc kubenswrapper[4851]: I1001 13:13:28.026666 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-956c876c5-cvlqz"] Oct 01 13:13:28 crc kubenswrapper[4851]: I1001 13:13:28.026721 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 13:13:28 crc kubenswrapper[4851]: I1001 13:13:28.026831 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 13:13:28 crc kubenswrapper[4851]: I1001 13:13:28.095017 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-f9dc8947d-h5tw8"] Oct 01 13:13:28 crc kubenswrapper[4851]: I1001 13:13:28.101206 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 13:13:28 crc kubenswrapper[4851]: I1001 13:13:28.143451 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 13:13:28 crc kubenswrapper[4851]: I1001 13:13:28.204553 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6c49b89697-kzlf9"] Oct 01 13:13:28 crc kubenswrapper[4851]: I1001 13:13:28.854436 4851 generic.go:334] "Generic (PLEG): container finished" podID="97ea855a-7426-4e7d-af3d-c7d498622629" containerID="eff1eea81bdcd588724281cd7d2b7364d414e44be86ee8e76d18498267e2e2e8" exitCode=0 Oct 01 13:13:28 crc kubenswrapper[4851]: I1001 13:13:28.854779 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-956c876c5-cvlqz" event={"ID":"97ea855a-7426-4e7d-af3d-c7d498622629","Type":"ContainerDied","Data":"eff1eea81bdcd588724281cd7d2b7364d414e44be86ee8e76d18498267e2e2e8"} Oct 01 13:13:28 crc kubenswrapper[4851]: I1001 13:13:28.854821 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-956c876c5-cvlqz" 
event={"ID":"97ea855a-7426-4e7d-af3d-c7d498622629","Type":"ContainerStarted","Data":"a003ca1070f8f6c76a127673f3fc7252102d62cd280ad08014eea7f85526f4d3"} Oct 01 13:13:28 crc kubenswrapper[4851]: I1001 13:13:28.870694 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f7d86b4d6-9685z" event={"ID":"83d01de2-5036-447d-9bcc-adae38fc5202","Type":"ContainerStarted","Data":"30916b7bc2994f3895552b0fa5ccb849a8f5a904d0ae92137a9815296509306d"} Oct 01 13:13:28 crc kubenswrapper[4851]: I1001 13:13:28.890618 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6c49b89697-kzlf9" event={"ID":"4fb21634-7084-49a9-88d7-8759e5c794cb","Type":"ContainerStarted","Data":"ec3d5b4e653256700deeb9df4ac02132fa6b664bfc1647a9c1d4ec5d4c261cd4"} Oct 01 13:13:28 crc kubenswrapper[4851]: I1001 13:13:28.935302 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f9dc8947d-h5tw8" event={"ID":"6f40427b-938d-4f47-a126-b05108b109a6","Type":"ContainerStarted","Data":"1ac11f9ec49b84b6c3f28d475f746db4561b66e46bc0916c55d245282b87baba"} Oct 01 13:13:28 crc kubenswrapper[4851]: I1001 13:13:28.935339 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f9dc8947d-h5tw8" event={"ID":"6f40427b-938d-4f47-a126-b05108b109a6","Type":"ContainerStarted","Data":"2287b7f9f413969d7ea20726208c432678c43877c69e20649bcd6bbbcc5f4341"} Oct 01 13:13:28 crc kubenswrapper[4851]: I1001 13:13:28.938380 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 13:13:28 crc kubenswrapper[4851]: I1001 13:13:28.944429 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.391199 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-79bfcf4f68-4nz4v"] Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.393288 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.397453 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.397683 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.435455 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-79bfcf4f68-4nz4v"] Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.571661 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hswh\" (UniqueName: \"kubernetes.io/projected/62a44312-8062-4c82-ab00-f87600fa8f93-kube-api-access-9hswh\") pod \"barbican-api-79bfcf4f68-4nz4v\" (UID: \"62a44312-8062-4c82-ab00-f87600fa8f93\") " pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.571722 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62a44312-8062-4c82-ab00-f87600fa8f93-public-tls-certs\") pod \"barbican-api-79bfcf4f68-4nz4v\" (UID: \"62a44312-8062-4c82-ab00-f87600fa8f93\") " pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.571760 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62a44312-8062-4c82-ab00-f87600fa8f93-internal-tls-certs\") pod \"barbican-api-79bfcf4f68-4nz4v\" (UID: \"62a44312-8062-4c82-ab00-f87600fa8f93\") " pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.571789 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62a44312-8062-4c82-ab00-f87600fa8f93-logs\") pod \"barbican-api-79bfcf4f68-4nz4v\" (UID: \"62a44312-8062-4c82-ab00-f87600fa8f93\") " pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.571807 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62a44312-8062-4c82-ab00-f87600fa8f93-config-data-custom\") pod \"barbican-api-79bfcf4f68-4nz4v\" (UID: \"62a44312-8062-4c82-ab00-f87600fa8f93\") " pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.571904 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62a44312-8062-4c82-ab00-f87600fa8f93-config-data\") pod \"barbican-api-79bfcf4f68-4nz4v\" (UID: \"62a44312-8062-4c82-ab00-f87600fa8f93\") " pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.571949 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62a44312-8062-4c82-ab00-f87600fa8f93-combined-ca-bundle\") pod \"barbican-api-79bfcf4f68-4nz4v\" (UID: \"62a44312-8062-4c82-ab00-f87600fa8f93\") " pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.673603 4851 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62a44312-8062-4c82-ab00-f87600fa8f93-combined-ca-bundle\") pod \"barbican-api-79bfcf4f68-4nz4v\" (UID: \"62a44312-8062-4c82-ab00-f87600fa8f93\") " pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.673799 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hswh\" (UniqueName: \"kubernetes.io/projected/62a44312-8062-4c82-ab00-f87600fa8f93-kube-api-access-9hswh\") pod \"barbican-api-79bfcf4f68-4nz4v\" (UID: \"62a44312-8062-4c82-ab00-f87600fa8f93\") " pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.673828 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62a44312-8062-4c82-ab00-f87600fa8f93-public-tls-certs\") pod \"barbican-api-79bfcf4f68-4nz4v\" (UID: \"62a44312-8062-4c82-ab00-f87600fa8f93\") " pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.673862 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62a44312-8062-4c82-ab00-f87600fa8f93-internal-tls-certs\") pod \"barbican-api-79bfcf4f68-4nz4v\" (UID: \"62a44312-8062-4c82-ab00-f87600fa8f93\") " pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.673895 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62a44312-8062-4c82-ab00-f87600fa8f93-logs\") pod \"barbican-api-79bfcf4f68-4nz4v\" (UID: \"62a44312-8062-4c82-ab00-f87600fa8f93\") " pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.673920 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62a44312-8062-4c82-ab00-f87600fa8f93-config-data-custom\") pod \"barbican-api-79bfcf4f68-4nz4v\" (UID: \"62a44312-8062-4c82-ab00-f87600fa8f93\") " pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.674120 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62a44312-8062-4c82-ab00-f87600fa8f93-config-data\") pod \"barbican-api-79bfcf4f68-4nz4v\" (UID: \"62a44312-8062-4c82-ab00-f87600fa8f93\") " pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.675319 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62a44312-8062-4c82-ab00-f87600fa8f93-logs\") pod \"barbican-api-79bfcf4f68-4nz4v\" (UID: \"62a44312-8062-4c82-ab00-f87600fa8f93\") " pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.679732 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62a44312-8062-4c82-ab00-f87600fa8f93-public-tls-certs\") pod \"barbican-api-79bfcf4f68-4nz4v\" (UID: \"62a44312-8062-4c82-ab00-f87600fa8f93\") " pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.680272 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/62a44312-8062-4c82-ab00-f87600fa8f93-combined-ca-bundle\") pod \"barbican-api-79bfcf4f68-4nz4v\" (UID: \"62a44312-8062-4c82-ab00-f87600fa8f93\") " pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.681742 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62a44312-8062-4c82-ab00-f87600fa8f93-config-data-custom\") pod \"barbican-api-79bfcf4f68-4nz4v\" (UID: \"62a44312-8062-4c82-ab00-f87600fa8f93\") " pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.682337 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62a44312-8062-4c82-ab00-f87600fa8f93-internal-tls-certs\") pod \"barbican-api-79bfcf4f68-4nz4v\" (UID: \"62a44312-8062-4c82-ab00-f87600fa8f93\") " pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.694133 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62a44312-8062-4c82-ab00-f87600fa8f93-config-data\") pod \"barbican-api-79bfcf4f68-4nz4v\" (UID: \"62a44312-8062-4c82-ab00-f87600fa8f93\") " pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.714091 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hswh\" (UniqueName: \"kubernetes.io/projected/62a44312-8062-4c82-ab00-f87600fa8f93-kube-api-access-9hswh\") pod \"barbican-api-79bfcf4f68-4nz4v\" (UID: \"62a44312-8062-4c82-ab00-f87600fa8f93\") " pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.781577 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.966905 4851 generic.go:334] "Generic (PLEG): container finished" podID="971ca0ac-6de7-42f1-bf29-5174fd80ced4" containerID="e24b7de9a6b59f816add29e99ed4c97b0171d8f00ee2b1591eb4620a52cff2dd" exitCode=1 Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.967092 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"971ca0ac-6de7-42f1-bf29-5174fd80ced4","Type":"ContainerDied","Data":"e24b7de9a6b59f816add29e99ed4c97b0171d8f00ee2b1591eb4620a52cff2dd"} Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.967187 4851 scope.go:117] "RemoveContainer" containerID="d55abae016177de4dd6c005c3c19e8a7fcaf164791a47b7bbc2c48b5523a7265" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.968335 4851 scope.go:117] "RemoveContainer" containerID="e24b7de9a6b59f816add29e99ed4c97b0171d8f00ee2b1591eb4620a52cff2dd" Oct 01 13:13:29 crc kubenswrapper[4851]: E1001 13:13:29.970642 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(971ca0ac-6de7-42f1-bf29-5174fd80ced4)\"" pod="openstack/watcher-decision-engine-0" podUID="971ca0ac-6de7-42f1-bf29-5174fd80ced4" Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.976124 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9zwxh" event={"ID":"08f27829-7a4b-49d3-aed3-dbae56854228","Type":"ContainerStarted","Data":"138e17912a61a9728505be01bffd19e9f6001144eb35aa8790beea2649d9e8d0"} Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.987473 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-956c876c5-cvlqz" event={"ID":"97ea855a-7426-4e7d-af3d-c7d498622629","Type":"ContainerStarted","Data":"6597334ee09778a6d3effb76f5dbb9fec5d03ff96d7aa500cb7462e5a64f260a"} Oct 01 13:13:29 crc kubenswrapper[4851]: I1001 13:13:29.988281 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:13:30 crc kubenswrapper[4851]: I1001 13:13:30.011090 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f9dc8947d-h5tw8" event={"ID":"6f40427b-938d-4f47-a126-b05108b109a6","Type":"ContainerStarted","Data":"4cb73528187ef34285135d104b2ba7bbcfb39a5caac97fa3c33ea0f0ba1fcbfe"} Oct 01 13:13:30 crc kubenswrapper[4851]: I1001 13:13:30.011170 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:30 crc kubenswrapper[4851]: I1001 13:13:30.011342 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:30 crc kubenswrapper[4851]: I1001 13:13:30.022067 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-9zwxh" podStartSLOduration=3.67954886 podStartE2EDuration="1m7.022041695s" podCreationTimestamp="2025-10-01 13:12:23 +0000 UTC" firstStartedPulling="2025-10-01 13:12:25.074966896 +0000 UTC m=+1153.420084382" lastFinishedPulling="2025-10-01 13:13:28.417459731 +0000 UTC m=+1216.762577217" observedRunningTime="2025-10-01 13:13:30.009001713 +0000 UTC m=+1218.354119199" watchObservedRunningTime="2025-10-01 13:13:30.022041695 +0000 UTC m=+1218.367159181" Oct 01 13:13:30 crc 
kubenswrapper[4851]: I1001 13:13:30.040530 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-956c876c5-cvlqz" podStartSLOduration=4.040485261 podStartE2EDuration="4.040485261s" podCreationTimestamp="2025-10-01 13:13:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:13:30.035221061 +0000 UTC m=+1218.380338547" watchObservedRunningTime="2025-10-01 13:13:30.040485261 +0000 UTC m=+1218.385602747" Oct 01 13:13:30 crc kubenswrapper[4851]: I1001 13:13:30.057214 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-f9dc8947d-h5tw8" podStartSLOduration=4.057198368 podStartE2EDuration="4.057198368s" podCreationTimestamp="2025-10-01 13:13:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:13:30.054326016 +0000 UTC m=+1218.399443502" watchObservedRunningTime="2025-10-01 13:13:30.057198368 +0000 UTC m=+1218.402315854" Oct 01 13:13:31 crc kubenswrapper[4851]: I1001 13:13:31.023904 4851 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 13:13:31 crc kubenswrapper[4851]: I1001 13:13:31.024152 4851 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 13:13:31 crc kubenswrapper[4851]: I1001 13:13:31.467267 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 13:13:31 crc kubenswrapper[4851]: I1001 13:13:31.611323 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 13:13:31 crc kubenswrapper[4851]: I1001 13:13:31.813826 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-79bfcf4f68-4nz4v"] Oct 01 13:13:32 crc kubenswrapper[4851]: I1001 13:13:32.041987 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6c49b89697-kzlf9" event={"ID":"4fb21634-7084-49a9-88d7-8759e5c794cb","Type":"ContainerStarted","Data":"dd2af34d64ca62535af079b946e58b6bc4343d403327b41102c5c0ac41b839af"} Oct 01 13:13:32 crc kubenswrapper[4851]: I1001 13:13:32.042035 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6c49b89697-kzlf9" event={"ID":"4fb21634-7084-49a9-88d7-8759e5c794cb","Type":"ContainerStarted","Data":"24d50f9bc53b00ecb9c26ddef55040bd4154928103e95245b0030ba5236777ce"} Oct 01 13:13:32 crc kubenswrapper[4851]: I1001 13:13:32.052029 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f7d86b4d6-9685z" event={"ID":"83d01de2-5036-447d-9bcc-adae38fc5202","Type":"ContainerStarted","Data":"999c11c43b9a995c5d6bc9dffc4ccfae80d8cc58520163517d68dac18ea1ba46"} Oct 01 13:13:32 crc kubenswrapper[4851]: I1001 13:13:32.052071 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f7d86b4d6-9685z" event={"ID":"83d01de2-5036-447d-9bcc-adae38fc5202","Type":"ContainerStarted","Data":"b55f83a6207ce90f334528182109410e108fd2ae569cf02ffef2a01049aa9acd"} Oct 01 13:13:32 crc kubenswrapper[4851]: I1001 13:13:32.057476 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79bfcf4f68-4nz4v" event={"ID":"62a44312-8062-4c82-ab00-f87600fa8f93","Type":"ContainerStarted","Data":"3567b8a0e7a7242c8c4b6887381e94f3b722aa1273e892389dd5b176d5594a1c"} Oct 01 
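
The pod_startup_latency_tracker lines distinguish podStartE2EDuration (observed running time minus podCreationTimestamp) from podStartSLOduration, which additionally subtracts image-pull time; pods whose pull timestamps are zero (0001-01-01 …) therefore log identical values, as placement and keystone did earlier. Checking the arithmetic against the cinder-db-sync-9zwxh line above:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        parse := func(s string) time.Time {
            t, err := time.Parse(layout, s)
            if err != nil {
                panic(err)
            }
            return t
        }
        created := parse("2025-10-01 13:12:23 +0000 UTC")              // podCreationTimestamp
        running := parse("2025-10-01 13:13:30.022041695 +0000 UTC")    // watchObservedRunningTime
        pullStart := parse("2025-10-01 13:12:25.074966896 +0000 UTC")  // firstStartedPulling
        pullEnd := parse("2025-10-01 13:13:28.417459731 +0000 UTC")    // lastFinishedPulling

        e2e := running.Sub(created)
        slo := e2e - pullEnd.Sub(pullStart) // image-pull time excluded

        fmt.Println(e2e) // 1m7.022041695s, the logged podStartE2EDuration
        fmt.Println(slo) // 3.67954886s, the logged podStartSLOduration
    }
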
13:13:32 crc kubenswrapper[4851]: I1001 13:13:32.057540 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79bfcf4f68-4nz4v" event={"ID":"62a44312-8062-4c82-ab00-f87600fa8f93","Type":"ContainerStarted","Data":"e069e3880627b7fd297cf7f25542296b086a0a0991cfe6c36e2a6bf3c2179ea8"} Oct 01 13:13:32 crc kubenswrapper[4851]: I1001 13:13:32.069701 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6c49b89697-kzlf9" podStartSLOduration=3.004699035 podStartE2EDuration="6.069684255s" podCreationTimestamp="2025-10-01 13:13:26 +0000 UTC" firstStartedPulling="2025-10-01 13:13:28.204977316 +0000 UTC m=+1216.550094802" lastFinishedPulling="2025-10-01 13:13:31.269962536 +0000 UTC m=+1219.615080022" observedRunningTime="2025-10-01 13:13:32.061701147 +0000 UTC m=+1220.406818633" watchObservedRunningTime="2025-10-01 13:13:32.069684255 +0000 UTC m=+1220.414801751" Oct 01 13:13:32 crc kubenswrapper[4851]: I1001 13:13:32.083448 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7f7d86b4d6-9685z" podStartSLOduration=2.853701994 podStartE2EDuration="6.083433767s" podCreationTimestamp="2025-10-01 13:13:26 +0000 UTC" firstStartedPulling="2025-10-01 13:13:28.037397712 +0000 UTC m=+1216.382515198" lastFinishedPulling="2025-10-01 13:13:31.267129485 +0000 UTC m=+1219.612246971" observedRunningTime="2025-10-01 13:13:32.080184484 +0000 UTC m=+1220.425301970" watchObservedRunningTime="2025-10-01 13:13:32.083433767 +0000 UTC m=+1220.428551253" Oct 01 13:13:32 crc kubenswrapper[4851]: I1001 13:13:32.923837 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:13:32 crc kubenswrapper[4851]: I1001 13:13:32.924627 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="d713b6ba-8532-4b3c-aa32-c97d5547ea62" containerName="watcher-api" containerID="cri-o://e876dc9b41583fe6cd743a4b16f165959cc10fd52c92e16f3cd1a98200d6c5c1" gracePeriod=30 Oct 01 13:13:32 crc kubenswrapper[4851]: I1001 13:13:32.932100 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="d713b6ba-8532-4b3c-aa32-c97d5547ea62" containerName="watcher-api-log" containerID="cri-o://2543525a6144b684b2a662ffa5887481abaea2d9e7bcb0361bfdff0ad91e3504" gracePeriod=30 Oct 01 13:13:33 crc kubenswrapper[4851]: I1001 13:13:33.070598 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79bfcf4f68-4nz4v" event={"ID":"62a44312-8062-4c82-ab00-f87600fa8f93","Type":"ContainerStarted","Data":"d0e83efb45690d6fb2308498c3205fa19625ab8c9b9411cff54dca86ce0d6015"} Oct 01 13:13:33 crc kubenswrapper[4851]: I1001 13:13:33.071706 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:33 crc kubenswrapper[4851]: I1001 13:13:33.071732 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:33 crc kubenswrapper[4851]: I1001 13:13:33.087450 4851 generic.go:334] "Generic (PLEG): container finished" podID="d713b6ba-8532-4b3c-aa32-c97d5547ea62" containerID="2543525a6144b684b2a662ffa5887481abaea2d9e7bcb0361bfdff0ad91e3504" exitCode=143 Oct 01 13:13:33 crc kubenswrapper[4851]: I1001 13:13:33.087485 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" 
event={"ID":"d713b6ba-8532-4b3c-aa32-c97d5547ea62","Type":"ContainerDied","Data":"2543525a6144b684b2a662ffa5887481abaea2d9e7bcb0361bfdff0ad91e3504"} Oct 01 13:13:34 crc kubenswrapper[4851]: I1001 13:13:34.113599 4851 generic.go:334] "Generic (PLEG): container finished" podID="d713b6ba-8532-4b3c-aa32-c97d5547ea62" containerID="e876dc9b41583fe6cd743a4b16f165959cc10fd52c92e16f3cd1a98200d6c5c1" exitCode=0 Oct 01 13:13:34 crc kubenswrapper[4851]: I1001 13:13:34.113686 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"d713b6ba-8532-4b3c-aa32-c97d5547ea62","Type":"ContainerDied","Data":"e876dc9b41583fe6cd743a4b16f165959cc10fd52c92e16f3cd1a98200d6c5c1"} Oct 01 13:13:35 crc kubenswrapper[4851]: I1001 13:13:35.247493 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 01 13:13:35 crc kubenswrapper[4851]: I1001 13:13:35.247853 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 01 13:13:35 crc kubenswrapper[4851]: I1001 13:13:35.248442 4851 scope.go:117] "RemoveContainer" containerID="e24b7de9a6b59f816add29e99ed4c97b0171d8f00ee2b1591eb4620a52cff2dd" Oct 01 13:13:35 crc kubenswrapper[4851]: E1001 13:13:35.248837 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(971ca0ac-6de7-42f1-bf29-5174fd80ced4)\"" pod="openstack/watcher-decision-engine-0" podUID="971ca0ac-6de7-42f1-bf29-5174fd80ced4" Oct 01 13:13:35 crc kubenswrapper[4851]: I1001 13:13:35.284635 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-79bfcf4f68-4nz4v" podStartSLOduration=6.284611285 podStartE2EDuration="6.284611285s" podCreationTimestamp="2025-10-01 13:13:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:13:33.101297613 +0000 UTC m=+1221.446415099" watchObservedRunningTime="2025-10-01 13:13:35.284611285 +0000 UTC m=+1223.629728771" Oct 01 13:13:36 crc kubenswrapper[4851]: I1001 13:13:36.519883 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:13:36 crc kubenswrapper[4851]: I1001 13:13:36.597042 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d98d6dddf-lzbs9"] Oct 01 13:13:36 crc kubenswrapper[4851]: I1001 13:13:36.597477 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" podUID="6b16af47-e71e-42f4-b45f-72afaaf0e13c" containerName="dnsmasq-dns" containerID="cri-o://3c334d0cc4506bef3d5cfc40c9419db0e3ef1d6b6268f6c032bc4f9a002a6599" gracePeriod=10 Oct 01 13:13:37 crc kubenswrapper[4851]: I1001 13:13:37.161881 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" event={"ID":"6b16af47-e71e-42f4-b45f-72afaaf0e13c","Type":"ContainerDied","Data":"3c334d0cc4506bef3d5cfc40c9419db0e3ef1d6b6268f6c032bc4f9a002a6599"} Oct 01 13:13:37 crc kubenswrapper[4851]: I1001 13:13:37.161978 4851 generic.go:334] "Generic (PLEG): container finished" podID="6b16af47-e71e-42f4-b45f-72afaaf0e13c" containerID="3c334d0cc4506bef3d5cfc40c9419db0e3ef1d6b6268f6c032bc4f9a002a6599" exitCode=0 Oct 01 13:13:37 crc 
kubenswrapper[4851]: I1001 13:13:37.937041 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.100332 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.121158 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.197001 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"d713b6ba-8532-4b3c-aa32-c97d5547ea62","Type":"ContainerDied","Data":"793317acdf31b9e0cbe5771833bcd4a2f078f89809666297fa4c89b4d55d5ee4"} Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.197058 4851 scope.go:117] "RemoveContainer" containerID="e876dc9b41583fe6cd743a4b16f165959cc10fd52c92e16f3cd1a98200d6c5c1" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.197218 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.263980 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8hdv\" (UniqueName: \"kubernetes.io/projected/d713b6ba-8532-4b3c-aa32-c97d5547ea62-kube-api-access-x8hdv\") pod \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\" (UID: \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\") " Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.264078 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d713b6ba-8532-4b3c-aa32-c97d5547ea62-config-data\") pod \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\" (UID: \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\") " Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.264137 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d713b6ba-8532-4b3c-aa32-c97d5547ea62-logs\") pod \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\" (UID: \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\") " Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.264217 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d713b6ba-8532-4b3c-aa32-c97d5547ea62-combined-ca-bundle\") pod \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\" (UID: \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\") " Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.264242 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d713b6ba-8532-4b3c-aa32-c97d5547ea62-custom-prometheus-ca\") pod \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\" (UID: \"d713b6ba-8532-4b3c-aa32-c97d5547ea62\") " Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.264949 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d713b6ba-8532-4b3c-aa32-c97d5547ea62-logs" (OuterVolumeSpecName: "logs") pod "d713b6ba-8532-4b3c-aa32-c97d5547ea62" (UID: "d713b6ba-8532-4b3c-aa32-c97d5547ea62"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.266408 4851 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d713b6ba-8532-4b3c-aa32-c97d5547ea62-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.274756 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d713b6ba-8532-4b3c-aa32-c97d5547ea62-kube-api-access-x8hdv" (OuterVolumeSpecName: "kube-api-access-x8hdv") pod "d713b6ba-8532-4b3c-aa32-c97d5547ea62" (UID: "d713b6ba-8532-4b3c-aa32-c97d5547ea62"). InnerVolumeSpecName "kube-api-access-x8hdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.303813 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d713b6ba-8532-4b3c-aa32-c97d5547ea62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d713b6ba-8532-4b3c-aa32-c97d5547ea62" (UID: "d713b6ba-8532-4b3c-aa32-c97d5547ea62"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.339317 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d713b6ba-8532-4b3c-aa32-c97d5547ea62-config-data" (OuterVolumeSpecName: "config-data") pod "d713b6ba-8532-4b3c-aa32-c97d5547ea62" (UID: "d713b6ba-8532-4b3c-aa32-c97d5547ea62"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.357947 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d713b6ba-8532-4b3c-aa32-c97d5547ea62-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "d713b6ba-8532-4b3c-aa32-c97d5547ea62" (UID: "d713b6ba-8532-4b3c-aa32-c97d5547ea62"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.368191 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d713b6ba-8532-4b3c-aa32-c97d5547ea62-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.368221 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d713b6ba-8532-4b3c-aa32-c97d5547ea62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.368231 4851 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d713b6ba-8532-4b3c-aa32-c97d5547ea62-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.368240 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8hdv\" (UniqueName: \"kubernetes.io/projected/d713b6ba-8532-4b3c-aa32-c97d5547ea62-kube-api-access-x8hdv\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.534389 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.552343 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.565267 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:13:38 crc kubenswrapper[4851]: E1001 13:13:38.565844 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d713b6ba-8532-4b3c-aa32-c97d5547ea62" containerName="watcher-api-log" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.566025 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="d713b6ba-8532-4b3c-aa32-c97d5547ea62" containerName="watcher-api-log" Oct 01 13:13:38 crc kubenswrapper[4851]: E1001 13:13:38.566132 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d713b6ba-8532-4b3c-aa32-c97d5547ea62" containerName="watcher-api" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.566245 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="d713b6ba-8532-4b3c-aa32-c97d5547ea62" containerName="watcher-api" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.566544 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="d713b6ba-8532-4b3c-aa32-c97d5547ea62" containerName="watcher-api" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.566945 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="d713b6ba-8532-4b3c-aa32-c97d5547ea62" containerName="watcher-api-log" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.568242 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.570622 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh2l9\" (UniqueName: \"kubernetes.io/projected/bb173f87-ff1f-4a9f-9de8-5073545e2697-kube-api-access-dh2l9\") pod \"watcher-api-0\" (UID: \"bb173f87-ff1f-4a9f-9de8-5073545e2697\") " pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.570665 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb173f87-ff1f-4a9f-9de8-5073545e2697-config-data\") pod \"watcher-api-0\" (UID: \"bb173f87-ff1f-4a9f-9de8-5073545e2697\") " pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.570689 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bb173f87-ff1f-4a9f-9de8-5073545e2697-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"bb173f87-ff1f-4a9f-9de8-5073545e2697\") " pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.570703 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb173f87-ff1f-4a9f-9de8-5073545e2697-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"bb173f87-ff1f-4a9f-9de8-5073545e2697\") " pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.570752 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb173f87-ff1f-4a9f-9de8-5073545e2697-public-tls-certs\") pod \"watcher-api-0\" (UID: \"bb173f87-ff1f-4a9f-9de8-5073545e2697\") " pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.570779 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb173f87-ff1f-4a9f-9de8-5073545e2697-logs\") pod \"watcher-api-0\" (UID: \"bb173f87-ff1f-4a9f-9de8-5073545e2697\") " pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.570837 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb173f87-ff1f-4a9f-9de8-5073545e2697-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"bb173f87-ff1f-4a9f-9de8-5073545e2697\") " pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.571062 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.571290 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.576250 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.578109 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.671944 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh2l9\" (UniqueName: 
\"kubernetes.io/projected/bb173f87-ff1f-4a9f-9de8-5073545e2697-kube-api-access-dh2l9\") pod \"watcher-api-0\" (UID: \"bb173f87-ff1f-4a9f-9de8-5073545e2697\") " pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.671989 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb173f87-ff1f-4a9f-9de8-5073545e2697-config-data\") pod \"watcher-api-0\" (UID: \"bb173f87-ff1f-4a9f-9de8-5073545e2697\") " pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.672011 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bb173f87-ff1f-4a9f-9de8-5073545e2697-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"bb173f87-ff1f-4a9f-9de8-5073545e2697\") " pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.672027 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb173f87-ff1f-4a9f-9de8-5073545e2697-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"bb173f87-ff1f-4a9f-9de8-5073545e2697\") " pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.672081 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb173f87-ff1f-4a9f-9de8-5073545e2697-public-tls-certs\") pod \"watcher-api-0\" (UID: \"bb173f87-ff1f-4a9f-9de8-5073545e2697\") " pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.672108 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb173f87-ff1f-4a9f-9de8-5073545e2697-logs\") pod \"watcher-api-0\" (UID: \"bb173f87-ff1f-4a9f-9de8-5073545e2697\") " pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.672171 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb173f87-ff1f-4a9f-9de8-5073545e2697-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"bb173f87-ff1f-4a9f-9de8-5073545e2697\") " pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.672713 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb173f87-ff1f-4a9f-9de8-5073545e2697-logs\") pod \"watcher-api-0\" (UID: \"bb173f87-ff1f-4a9f-9de8-5073545e2697\") " pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.683435 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb173f87-ff1f-4a9f-9de8-5073545e2697-public-tls-certs\") pod \"watcher-api-0\" (UID: \"bb173f87-ff1f-4a9f-9de8-5073545e2697\") " pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.683442 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb173f87-ff1f-4a9f-9de8-5073545e2697-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"bb173f87-ff1f-4a9f-9de8-5073545e2697\") " pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.683476 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/bb173f87-ff1f-4a9f-9de8-5073545e2697-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"bb173f87-ff1f-4a9f-9de8-5073545e2697\") " pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.684463 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb173f87-ff1f-4a9f-9de8-5073545e2697-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"bb173f87-ff1f-4a9f-9de8-5073545e2697\") " pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.684699 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb173f87-ff1f-4a9f-9de8-5073545e2697-config-data\") pod \"watcher-api-0\" (UID: \"bb173f87-ff1f-4a9f-9de8-5073545e2697\") " pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.686155 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh2l9\" (UniqueName: \"kubernetes.io/projected/bb173f87-ff1f-4a9f-9de8-5073545e2697-kube-api-access-dh2l9\") pod \"watcher-api-0\" (UID: \"bb173f87-ff1f-4a9f-9de8-5073545e2697\") " pod="openstack/watcher-api-0" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.847735 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:38 crc kubenswrapper[4851]: I1001 13:13:38.894824 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.017028 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.087513 4851 scope.go:117] "RemoveContainer" containerID="2543525a6144b684b2a662ffa5887481abaea2d9e7bcb0361bfdff0ad91e3504" Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.196472 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.280667 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" event={"ID":"6b16af47-e71e-42f4-b45f-72afaaf0e13c","Type":"ContainerDied","Data":"715f94edf35f0a3770f7ffbb228c4f4679cc50d9edeab538080d032e5400826c"} Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.280957 4851 scope.go:117] "RemoveContainer" containerID="3c334d0cc4506bef3d5cfc40c9419db0e3ef1d6b6268f6c032bc4f9a002a6599" Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.280955 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d98d6dddf-lzbs9" Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.300890 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-config\") pod \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.402919 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-ovsdbserver-sb\") pod \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.402968 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-dns-swift-storage-0\") pod \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.403005 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-778pm\" (UniqueName: \"kubernetes.io/projected/6b16af47-e71e-42f4-b45f-72afaaf0e13c-kube-api-access-778pm\") pod \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.403039 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-dns-svc\") pod \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.403240 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-ovsdbserver-nb\") pod \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\" (UID: \"6b16af47-e71e-42f4-b45f-72afaaf0e13c\") " Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.403396 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-config" (OuterVolumeSpecName: "config") pod "6b16af47-e71e-42f4-b45f-72afaaf0e13c" (UID: "6b16af47-e71e-42f4-b45f-72afaaf0e13c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.404042 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.413675 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b16af47-e71e-42f4-b45f-72afaaf0e13c-kube-api-access-778pm" (OuterVolumeSpecName: "kube-api-access-778pm") pod "6b16af47-e71e-42f4-b45f-72afaaf0e13c" (UID: "6b16af47-e71e-42f4-b45f-72afaaf0e13c"). InnerVolumeSpecName "kube-api-access-778pm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.442663 4851 scope.go:117] "RemoveContainer" containerID="6cda9ad7e176ad79970fdfa33bc29213f0e4717a74ee8af168ee70bec411f0d1" Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.508739 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-778pm\" (UniqueName: \"kubernetes.io/projected/6b16af47-e71e-42f4-b45f-72afaaf0e13c-kube-api-access-778pm\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.520033 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6b16af47-e71e-42f4-b45f-72afaaf0e13c" (UID: "6b16af47-e71e-42f4-b45f-72afaaf0e13c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.530005 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6b16af47-e71e-42f4-b45f-72afaaf0e13c" (UID: "6b16af47-e71e-42f4-b45f-72afaaf0e13c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.558951 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6b16af47-e71e-42f4-b45f-72afaaf0e13c" (UID: "6b16af47-e71e-42f4-b45f-72afaaf0e13c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.572642 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6b16af47-e71e-42f4-b45f-72afaaf0e13c" (UID: "6b16af47-e71e-42f4-b45f-72afaaf0e13c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.612121 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.612154 4851 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.612164 4851 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.612177 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b16af47-e71e-42f4-b45f-72afaaf0e13c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.646547 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d98d6dddf-lzbs9"] Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.653243 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d98d6dddf-lzbs9"] Oct 01 13:13:39 crc kubenswrapper[4851]: I1001 13:13:39.662620 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:13:40 crc kubenswrapper[4851]: I1001 13:13:40.124067 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-58c9859d68-bckn5" Oct 01 13:13:40 crc kubenswrapper[4851]: I1001 13:13:40.233212 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67b666754b-b52ns"] Oct 01 13:13:40 crc kubenswrapper[4851]: I1001 13:13:40.233433 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67b666754b-b52ns" podUID="1db63449-71cb-4fa2-86db-43a83a914643" containerName="horizon-log" containerID="cri-o://c39866bdf066b1da181c5a3017d9775a7100cc6f4794767d4519b16f3085d3af" gracePeriod=30 Oct 01 13:13:40 crc kubenswrapper[4851]: I1001 13:13:40.233835 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67b666754b-b52ns" podUID="1db63449-71cb-4fa2-86db-43a83a914643" containerName="horizon" containerID="cri-o://1566f50774133ea95c6a0e9c324d2e5bcf12901648eb5a468022250e57e983e9" gracePeriod=30 Oct 01 13:13:40 crc kubenswrapper[4851]: I1001 13:13:40.244575 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67b666754b-b52ns" podUID="1db63449-71cb-4fa2-86db-43a83a914643" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.160:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Oct 01 13:13:40 crc kubenswrapper[4851]: I1001 13:13:40.303928 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"bb173f87-ff1f-4a9f-9de8-5073545e2697","Type":"ContainerStarted","Data":"740f0e9ed462ca8a42aa778e40ce030d03f79caa1d2e2e9bf3d4889d58307fc5"} Oct 01 13:13:40 crc kubenswrapper[4851]: I1001 13:13:40.303970 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" 
event={"ID":"bb173f87-ff1f-4a9f-9de8-5073545e2697","Type":"ContainerStarted","Data":"7d30135b9ec6c35d9b004e1983e9d917d0d26bd52603ab65790615c41063e7f3"} Oct 01 13:13:40 crc kubenswrapper[4851]: I1001 13:13:40.311475 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55afd90b-7461-4db7-89a6-d45f9bdcb1b3","Type":"ContainerStarted","Data":"e285c4a1b5e56583514f9abc7a427ec82264d1b7e158b4d15c5da89f915ff7ea"} Oct 01 13:13:40 crc kubenswrapper[4851]: I1001 13:13:40.311636 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55afd90b-7461-4db7-89a6-d45f9bdcb1b3" containerName="ceilometer-central-agent" containerID="cri-o://11a1c08f2113a358b5d22ecdf09d9cd74ac93459e063a5735048c02487b5cb3d" gracePeriod=30 Oct 01 13:13:40 crc kubenswrapper[4851]: I1001 13:13:40.311867 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 13:13:40 crc kubenswrapper[4851]: I1001 13:13:40.312096 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55afd90b-7461-4db7-89a6-d45f9bdcb1b3" containerName="proxy-httpd" containerID="cri-o://e285c4a1b5e56583514f9abc7a427ec82264d1b7e158b4d15c5da89f915ff7ea" gracePeriod=30 Oct 01 13:13:40 crc kubenswrapper[4851]: I1001 13:13:40.312142 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55afd90b-7461-4db7-89a6-d45f9bdcb1b3" containerName="sg-core" containerID="cri-o://5c73849f60fb43fb34c8f621f09a4a436c73dd24d7472e776293c138336b62ea" gracePeriod=30 Oct 01 13:13:40 crc kubenswrapper[4851]: I1001 13:13:40.312185 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55afd90b-7461-4db7-89a6-d45f9bdcb1b3" containerName="ceilometer-notification-agent" containerID="cri-o://6a062c003a7de76b285243f92119681912715e0c2f6b637150600517b03f8579" gracePeriod=30 Oct 01 13:13:40 crc kubenswrapper[4851]: I1001 13:13:40.348472 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b16af47-e71e-42f4-b45f-72afaaf0e13c" path="/var/lib/kubelet/pods/6b16af47-e71e-42f4-b45f-72afaaf0e13c/volumes" Oct 01 13:13:40 crc kubenswrapper[4851]: I1001 13:13:40.351047 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="d713b6ba-8532-4b3c-aa32-c97d5547ea62" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9322/\": dial tcp 10.217.0.164:9322: i/o timeout (Client.Timeout exceeded while awaiting headers)" Oct 01 13:13:40 crc kubenswrapper[4851]: I1001 13:13:40.351093 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="d713b6ba-8532-4b3c-aa32-c97d5547ea62" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.164:9322/\": dial tcp 10.217.0.164:9322: i/o timeout (Client.Timeout exceeded while awaiting headers)" Oct 01 13:13:40 crc kubenswrapper[4851]: I1001 13:13:40.351191 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d713b6ba-8532-4b3c-aa32-c97d5547ea62" path="/var/lib/kubelet/pods/d713b6ba-8532-4b3c-aa32-c97d5547ea62/volumes" Oct 01 13:13:40 crc kubenswrapper[4851]: I1001 13:13:40.360553 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.581521246 podStartE2EDuration="1m17.360534199s" podCreationTimestamp="2025-10-01 13:12:23 
+0000 UTC" firstStartedPulling="2025-10-01 13:12:24.538689908 +0000 UTC m=+1152.883807394" lastFinishedPulling="2025-10-01 13:13:39.317702861 +0000 UTC m=+1227.662820347" observedRunningTime="2025-10-01 13:13:40.332861809 +0000 UTC m=+1228.677979295" watchObservedRunningTime="2025-10-01 13:13:40.360534199 +0000 UTC m=+1228.705651685" Oct 01 13:13:41 crc kubenswrapper[4851]: I1001 13:13:41.349261 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"bb173f87-ff1f-4a9f-9de8-5073545e2697","Type":"ContainerStarted","Data":"bc0ec3bda91c68145d44c89b6c1b501f1111df08c474987ee48ebbc149190db8"} Oct 01 13:13:41 crc kubenswrapper[4851]: I1001 13:13:41.349841 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 01 13:13:41 crc kubenswrapper[4851]: I1001 13:13:41.353471 4851 generic.go:334] "Generic (PLEG): container finished" podID="55afd90b-7461-4db7-89a6-d45f9bdcb1b3" containerID="e285c4a1b5e56583514f9abc7a427ec82264d1b7e158b4d15c5da89f915ff7ea" exitCode=0 Oct 01 13:13:41 crc kubenswrapper[4851]: I1001 13:13:41.353519 4851 generic.go:334] "Generic (PLEG): container finished" podID="55afd90b-7461-4db7-89a6-d45f9bdcb1b3" containerID="5c73849f60fb43fb34c8f621f09a4a436c73dd24d7472e776293c138336b62ea" exitCode=2 Oct 01 13:13:41 crc kubenswrapper[4851]: I1001 13:13:41.353530 4851 generic.go:334] "Generic (PLEG): container finished" podID="55afd90b-7461-4db7-89a6-d45f9bdcb1b3" containerID="11a1c08f2113a358b5d22ecdf09d9cd74ac93459e063a5735048c02487b5cb3d" exitCode=0 Oct 01 13:13:41 crc kubenswrapper[4851]: I1001 13:13:41.353536 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55afd90b-7461-4db7-89a6-d45f9bdcb1b3","Type":"ContainerDied","Data":"e285c4a1b5e56583514f9abc7a427ec82264d1b7e158b4d15c5da89f915ff7ea"} Oct 01 13:13:41 crc kubenswrapper[4851]: I1001 13:13:41.353584 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55afd90b-7461-4db7-89a6-d45f9bdcb1b3","Type":"ContainerDied","Data":"5c73849f60fb43fb34c8f621f09a4a436c73dd24d7472e776293c138336b62ea"} Oct 01 13:13:41 crc kubenswrapper[4851]: I1001 13:13:41.353603 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55afd90b-7461-4db7-89a6-d45f9bdcb1b3","Type":"ContainerDied","Data":"11a1c08f2113a358b5d22ecdf09d9cd74ac93459e063a5735048c02487b5cb3d"} Oct 01 13:13:41 crc kubenswrapper[4851]: I1001 13:13:41.393670 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=3.39364119 podStartE2EDuration="3.39364119s" podCreationTimestamp="2025-10-01 13:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:13:41.376052198 +0000 UTC m=+1229.721169734" watchObservedRunningTime="2025-10-01 13:13:41.39364119 +0000 UTC m=+1229.738758686" Oct 01 13:13:41 crc kubenswrapper[4851]: I1001 13:13:41.461449 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:41 crc kubenswrapper[4851]: I1001 13:13:41.636164 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-79bfcf4f68-4nz4v" Oct 01 13:13:41 crc kubenswrapper[4851]: I1001 13:13:41.715038 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-f9dc8947d-h5tw8"] Oct 01 13:13:41 
crc kubenswrapper[4851]: I1001 13:13:41.715223 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-f9dc8947d-h5tw8" podUID="6f40427b-938d-4f47-a126-b05108b109a6" containerName="barbican-api-log" containerID="cri-o://1ac11f9ec49b84b6c3f28d475f746db4561b66e46bc0916c55d245282b87baba" gracePeriod=30 Oct 01 13:13:41 crc kubenswrapper[4851]: I1001 13:13:41.715587 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-f9dc8947d-h5tw8" podUID="6f40427b-938d-4f47-a126-b05108b109a6" containerName="barbican-api" containerID="cri-o://4cb73528187ef34285135d104b2ba7bbcfb39a5caac97fa3c33ea0f0ba1fcbfe" gracePeriod=30 Oct 01 13:13:41 crc kubenswrapper[4851]: I1001 13:13:41.728304 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-f9dc8947d-h5tw8" podUID="6f40427b-938d-4f47-a126-b05108b109a6" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.173:9311/healthcheck\": EOF" Oct 01 13:13:41 crc kubenswrapper[4851]: I1001 13:13:41.728605 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-f9dc8947d-h5tw8" podUID="6f40427b-938d-4f47-a126-b05108b109a6" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.173:9311/healthcheck\": EOF" Oct 01 13:13:41 crc kubenswrapper[4851]: I1001 13:13:41.728837 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-f9dc8947d-h5tw8" podUID="6f40427b-938d-4f47-a126-b05108b109a6" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.173:9311/healthcheck\": EOF" Oct 01 13:13:42 crc kubenswrapper[4851]: I1001 13:13:42.363138 4851 generic.go:334] "Generic (PLEG): container finished" podID="6f40427b-938d-4f47-a126-b05108b109a6" containerID="1ac11f9ec49b84b6c3f28d475f746db4561b66e46bc0916c55d245282b87baba" exitCode=143 Oct 01 13:13:42 crc kubenswrapper[4851]: I1001 13:13:42.363229 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f9dc8947d-h5tw8" event={"ID":"6f40427b-938d-4f47-a126-b05108b109a6","Type":"ContainerDied","Data":"1ac11f9ec49b84b6c3f28d475f746db4561b66e46bc0916c55d245282b87baba"} Oct 01 13:13:42 crc kubenswrapper[4851]: I1001 13:13:42.365903 4851 generic.go:334] "Generic (PLEG): container finished" podID="08f27829-7a4b-49d3-aed3-dbae56854228" containerID="138e17912a61a9728505be01bffd19e9f6001144eb35aa8790beea2649d9e8d0" exitCode=0 Oct 01 13:13:42 crc kubenswrapper[4851]: I1001 13:13:42.365988 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9zwxh" event={"ID":"08f27829-7a4b-49d3-aed3-dbae56854228","Type":"ContainerDied","Data":"138e17912a61a9728505be01bffd19e9f6001144eb35aa8790beea2649d9e8d0"} Oct 01 13:13:42 crc kubenswrapper[4851]: I1001 13:13:42.370336 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67b666754b-b52ns" podUID="1db63449-71cb-4fa2-86db-43a83a914643" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.160:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:52706->10.217.0.160:8443: read: connection reset by peer" Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.381143 4851 generic.go:334] "Generic (PLEG): container finished" podID="1db63449-71cb-4fa2-86db-43a83a914643" containerID="1566f50774133ea95c6a0e9c324d2e5bcf12901648eb5a468022250e57e983e9" exitCode=0 Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.381324 4851 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67b666754b-b52ns" event={"ID":"1db63449-71cb-4fa2-86db-43a83a914643","Type":"ContainerDied","Data":"1566f50774133ea95c6a0e9c324d2e5bcf12901648eb5a468022250e57e983e9"} Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.398279 4851 generic.go:334] "Generic (PLEG): container finished" podID="55afd90b-7461-4db7-89a6-d45f9bdcb1b3" containerID="6a062c003a7de76b285243f92119681912715e0c2f6b637150600517b03f8579" exitCode=0 Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.398385 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55afd90b-7461-4db7-89a6-d45f9bdcb1b3","Type":"ContainerDied","Data":"6a062c003a7de76b285243f92119681912715e0c2f6b637150600517b03f8579"} Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.701482 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.748189 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.804966 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-config-data\") pod \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.805015 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc7p9\" (UniqueName: \"kubernetes.io/projected/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-kube-api-access-zc7p9\") pod \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.805051 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-sg-core-conf-yaml\") pod \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.805120 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-run-httpd\") pod \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.805155 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-scripts\") pod \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.805192 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-log-httpd\") pod \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\" (UID: \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.805251 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-combined-ca-bundle\") pod \"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\" (UID: 
\"55afd90b-7461-4db7-89a6-d45f9bdcb1b3\") " Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.806080 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "55afd90b-7461-4db7-89a6-d45f9bdcb1b3" (UID: "55afd90b-7461-4db7-89a6-d45f9bdcb1b3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.809412 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "55afd90b-7461-4db7-89a6-d45f9bdcb1b3" (UID: "55afd90b-7461-4db7-89a6-d45f9bdcb1b3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.810852 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-kube-api-access-zc7p9" (OuterVolumeSpecName: "kube-api-access-zc7p9") pod "55afd90b-7461-4db7-89a6-d45f9bdcb1b3" (UID: "55afd90b-7461-4db7-89a6-d45f9bdcb1b3"). InnerVolumeSpecName "kube-api-access-zc7p9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.812253 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-scripts" (OuterVolumeSpecName: "scripts") pod "55afd90b-7461-4db7-89a6-d45f9bdcb1b3" (UID: "55afd90b-7461-4db7-89a6-d45f9bdcb1b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.842534 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "55afd90b-7461-4db7-89a6-d45f9bdcb1b3" (UID: "55afd90b-7461-4db7-89a6-d45f9bdcb1b3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.876627 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9zwxh" Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.896296 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.905780 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55afd90b-7461-4db7-89a6-d45f9bdcb1b3" (UID: "55afd90b-7461-4db7-89a6-d45f9bdcb1b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.907173 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc7p9\" (UniqueName: \"kubernetes.io/projected/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-kube-api-access-zc7p9\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.907216 4851 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.907227 4851 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.907246 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.907258 4851 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.907269 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:43 crc kubenswrapper[4851]: I1001 13:13:43.949216 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-config-data" (OuterVolumeSpecName: "config-data") pod "55afd90b-7461-4db7-89a6-d45f9bdcb1b3" (UID: "55afd90b-7461-4db7-89a6-d45f9bdcb1b3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.008687 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-config-data\") pod \"08f27829-7a4b-49d3-aed3-dbae56854228\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.009052 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-combined-ca-bundle\") pod \"08f27829-7a4b-49d3-aed3-dbae56854228\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.009156 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-scripts\") pod \"08f27829-7a4b-49d3-aed3-dbae56854228\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.009262 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08f27829-7a4b-49d3-aed3-dbae56854228-etc-machine-id\") pod \"08f27829-7a4b-49d3-aed3-dbae56854228\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.009484 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08f27829-7a4b-49d3-aed3-dbae56854228-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "08f27829-7a4b-49d3-aed3-dbae56854228" (UID: "08f27829-7a4b-49d3-aed3-dbae56854228"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.009700 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsbs8\" (UniqueName: \"kubernetes.io/projected/08f27829-7a4b-49d3-aed3-dbae56854228-kube-api-access-hsbs8\") pod \"08f27829-7a4b-49d3-aed3-dbae56854228\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.009825 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-db-sync-config-data\") pod \"08f27829-7a4b-49d3-aed3-dbae56854228\" (UID: \"08f27829-7a4b-49d3-aed3-dbae56854228\") " Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.010620 4851 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08f27829-7a4b-49d3-aed3-dbae56854228-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.010774 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55afd90b-7461-4db7-89a6-d45f9bdcb1b3-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.012624 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-scripts" (OuterVolumeSpecName: "scripts") pod "08f27829-7a4b-49d3-aed3-dbae56854228" (UID: "08f27829-7a4b-49d3-aed3-dbae56854228"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.013233 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "08f27829-7a4b-49d3-aed3-dbae56854228" (UID: "08f27829-7a4b-49d3-aed3-dbae56854228"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.013998 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08f27829-7a4b-49d3-aed3-dbae56854228-kube-api-access-hsbs8" (OuterVolumeSpecName: "kube-api-access-hsbs8") pod "08f27829-7a4b-49d3-aed3-dbae56854228" (UID: "08f27829-7a4b-49d3-aed3-dbae56854228"). InnerVolumeSpecName "kube-api-access-hsbs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.033718 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08f27829-7a4b-49d3-aed3-dbae56854228" (UID: "08f27829-7a4b-49d3-aed3-dbae56854228"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.062343 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-config-data" (OuterVolumeSpecName: "config-data") pod "08f27829-7a4b-49d3-aed3-dbae56854228" (UID: "08f27829-7a4b-49d3-aed3-dbae56854228"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.112657 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.112698 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.112709 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.112717 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsbs8\" (UniqueName: \"kubernetes.io/projected/08f27829-7a4b-49d3-aed3-dbae56854228-kube-api-access-hsbs8\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.112729 4851 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08f27829-7a4b-49d3-aed3-dbae56854228-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.407111 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9zwxh" event={"ID":"08f27829-7a4b-49d3-aed3-dbae56854228","Type":"ContainerDied","Data":"9f6ad98752756c5455b537114f40e502ce4a1922ff7e5a3cce459c9490205b6f"} Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.407398 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f6ad98752756c5455b537114f40e502ce4a1922ff7e5a3cce459c9490205b6f" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.407445 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9zwxh" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.413178 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.413234 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55afd90b-7461-4db7-89a6-d45f9bdcb1b3","Type":"ContainerDied","Data":"c90495043976322ea2e0c36cea4c2c248b599763d47b7721f18e83c9e34f066a"} Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.413266 4851 scope.go:117] "RemoveContainer" containerID="e285c4a1b5e56583514f9abc7a427ec82264d1b7e158b4d15c5da89f915ff7ea" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.506765 4851 scope.go:117] "RemoveContainer" containerID="5c73849f60fb43fb34c8f621f09a4a436c73dd24d7472e776293c138336b62ea" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.516948 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.554430 4851 scope.go:117] "RemoveContainer" containerID="6a062c003a7de76b285243f92119681912715e0c2f6b637150600517b03f8579" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.558571 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.570439 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:13:44 crc kubenswrapper[4851]: E1001 13:13:44.570869 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55afd90b-7461-4db7-89a6-d45f9bdcb1b3" containerName="sg-core" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.570887 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="55afd90b-7461-4db7-89a6-d45f9bdcb1b3" containerName="sg-core" Oct 01 13:13:44 crc kubenswrapper[4851]: E1001 13:13:44.570913 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55afd90b-7461-4db7-89a6-d45f9bdcb1b3" containerName="ceilometer-central-agent" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.570920 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="55afd90b-7461-4db7-89a6-d45f9bdcb1b3" containerName="ceilometer-central-agent" Oct 01 13:13:44 crc kubenswrapper[4851]: E1001 13:13:44.570935 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b16af47-e71e-42f4-b45f-72afaaf0e13c" containerName="init" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.570941 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b16af47-e71e-42f4-b45f-72afaaf0e13c" containerName="init" Oct 01 13:13:44 crc kubenswrapper[4851]: E1001 13:13:44.570952 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55afd90b-7461-4db7-89a6-d45f9bdcb1b3" containerName="ceilometer-notification-agent" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.570957 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="55afd90b-7461-4db7-89a6-d45f9bdcb1b3" containerName="ceilometer-notification-agent" Oct 01 13:13:44 crc kubenswrapper[4851]: E1001 13:13:44.570975 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55afd90b-7461-4db7-89a6-d45f9bdcb1b3" containerName="proxy-httpd" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.570980 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="55afd90b-7461-4db7-89a6-d45f9bdcb1b3" containerName="proxy-httpd" Oct 01 13:13:44 crc kubenswrapper[4851]: E1001 13:13:44.570989 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b16af47-e71e-42f4-b45f-72afaaf0e13c" containerName="dnsmasq-dns" Oct 01 13:13:44 crc kubenswrapper[4851]: 
I1001 13:13:44.571007 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b16af47-e71e-42f4-b45f-72afaaf0e13c" containerName="dnsmasq-dns" Oct 01 13:13:44 crc kubenswrapper[4851]: E1001 13:13:44.571019 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08f27829-7a4b-49d3-aed3-dbae56854228" containerName="cinder-db-sync" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.571025 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="08f27829-7a4b-49d3-aed3-dbae56854228" containerName="cinder-db-sync" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.571184 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="55afd90b-7461-4db7-89a6-d45f9bdcb1b3" containerName="proxy-httpd" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.571200 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="55afd90b-7461-4db7-89a6-d45f9bdcb1b3" containerName="ceilometer-central-agent" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.571212 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="55afd90b-7461-4db7-89a6-d45f9bdcb1b3" containerName="sg-core" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.571220 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="08f27829-7a4b-49d3-aed3-dbae56854228" containerName="cinder-db-sync" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.571235 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="55afd90b-7461-4db7-89a6-d45f9bdcb1b3" containerName="ceilometer-notification-agent" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.571246 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b16af47-e71e-42f4-b45f-72afaaf0e13c" containerName="dnsmasq-dns" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.575993 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.582406 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.582694 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.588010 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.625681 4851 scope.go:117] "RemoveContainer" containerID="11a1c08f2113a358b5d22ecdf09d9cd74ac93459e063a5735048c02487b5cb3d" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.710541 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d994585bc-s9wp6"] Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.711973 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.740201 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d994585bc-s9wp6"] Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.756007 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-config-data\") pod \"ceilometer-0\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.756047 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5bj8\" (UniqueName: \"kubernetes.io/projected/973a48f4-ab11-4c1d-b358-36c986b44f8c-kube-api-access-p5bj8\") pod \"ceilometer-0\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.756091 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-scripts\") pod \"ceilometer-0\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.756134 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.756158 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/973a48f4-ab11-4c1d-b358-36c986b44f8c-run-httpd\") pod \"ceilometer-0\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.756195 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/973a48f4-ab11-4c1d-b358-36c986b44f8c-log-httpd\") pod \"ceilometer-0\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.756213 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.772620 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.774178 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.779922 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.780166 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-ckz5d" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.780319 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.780381 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.797395 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.860726 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/973a48f4-ab11-4c1d-b358-36c986b44f8c-log-httpd\") pod \"ceilometer-0\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.860775 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.860891 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-config-data\") pod \"ceilometer-0\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.860920 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-dns-svc\") pod \"dnsmasq-dns-d994585bc-s9wp6\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.860943 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5bj8\" (UniqueName: \"kubernetes.io/projected/973a48f4-ab11-4c1d-b358-36c986b44f8c-kube-api-access-p5bj8\") pod \"ceilometer-0\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.860990 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx2zx\" (UniqueName: \"kubernetes.io/projected/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-kube-api-access-sx2zx\") pod \"dnsmasq-dns-d994585bc-s9wp6\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.861053 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-scripts\") pod \"ceilometer-0\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.861098 4851 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-ovsdbserver-nb\") pod \"dnsmasq-dns-d994585bc-s9wp6\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.861153 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.861173 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-ovsdbserver-sb\") pod \"dnsmasq-dns-d994585bc-s9wp6\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.861205 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/973a48f4-ab11-4c1d-b358-36c986b44f8c-run-httpd\") pod \"ceilometer-0\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.861226 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-config\") pod \"dnsmasq-dns-d994585bc-s9wp6\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.861241 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/973a48f4-ab11-4c1d-b358-36c986b44f8c-log-httpd\") pod \"ceilometer-0\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.861279 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-dns-swift-storage-0\") pod \"dnsmasq-dns-d994585bc-s9wp6\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.863838 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/973a48f4-ab11-4c1d-b358-36c986b44f8c-run-httpd\") pod \"ceilometer-0\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.867952 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-scripts\") pod \"ceilometer-0\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.872214 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.872845 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-config-data\") pod \"ceilometer-0\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.873424 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.879111 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.881749 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.887572 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.892729 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5bj8\" (UniqueName: \"kubernetes.io/projected/973a48f4-ab11-4c1d-b358-36c986b44f8c-kube-api-access-p5bj8\") pod \"ceilometer-0\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " pod="openstack/ceilometer-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.901826 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.962891 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.962972 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-dns-svc\") pod \"dnsmasq-dns-d994585bc-s9wp6\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.963016 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-scripts\") pod \"cinder-scheduler-0\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.963047 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx2zx\" (UniqueName: \"kubernetes.io/projected/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-kube-api-access-sx2zx\") pod \"dnsmasq-dns-d994585bc-s9wp6\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.963076 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-config-data\") pod \"cinder-scheduler-0\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.963134 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-ovsdbserver-nb\") pod \"dnsmasq-dns-d994585bc-s9wp6\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.963181 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-ovsdbserver-sb\") pod \"dnsmasq-dns-d994585bc-s9wp6\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.963215 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-config\") pod \"dnsmasq-dns-d994585bc-s9wp6\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.963236 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpwt2\" (UniqueName: \"kubernetes.io/projected/1179c5d5-fcf4-4732-959f-852ba78d2829-kube-api-access-kpwt2\") pod \"cinder-scheduler-0\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.963283 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-dns-swift-storage-0\") pod \"dnsmasq-dns-d994585bc-s9wp6\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.963324 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.963362 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1179c5d5-fcf4-4732-959f-852ba78d2829-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.963917 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-dns-svc\") pod \"dnsmasq-dns-d994585bc-s9wp6\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.964264 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-ovsdbserver-sb\") pod 
\"dnsmasq-dns-d994585bc-s9wp6\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.964862 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-dns-swift-storage-0\") pod \"dnsmasq-dns-d994585bc-s9wp6\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.966205 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-ovsdbserver-nb\") pod \"dnsmasq-dns-d994585bc-s9wp6\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.971248 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-config\") pod \"dnsmasq-dns-d994585bc-s9wp6\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.978961 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx2zx\" (UniqueName: \"kubernetes.io/projected/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-kube-api-access-sx2zx\") pod \"dnsmasq-dns-d994585bc-s9wp6\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:44 crc kubenswrapper[4851]: I1001 13:13:44.987561 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.055964 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.069423 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-config-data\") pod \"cinder-scheduler-0\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.069514 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-scripts\") pod \"cinder-api-0\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " pod="openstack/cinder-api-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.069576 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcp47\" (UniqueName: \"kubernetes.io/projected/bb913847-e636-4756-9de2-f5c7b45c2c9f-kube-api-access-dcp47\") pod \"cinder-api-0\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " pod="openstack/cinder-api-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.069621 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-config-data-custom\") pod \"cinder-api-0\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " pod="openstack/cinder-api-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.069646 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb913847-e636-4756-9de2-f5c7b45c2c9f-logs\") pod \"cinder-api-0\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " pod="openstack/cinder-api-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.069739 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpwt2\" (UniqueName: \"kubernetes.io/projected/1179c5d5-fcf4-4732-959f-852ba78d2829-kube-api-access-kpwt2\") pod \"cinder-scheduler-0\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.069781 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb913847-e636-4756-9de2-f5c7b45c2c9f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " pod="openstack/cinder-api-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.069869 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " pod="openstack/cinder-api-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.069934 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.070181 4851 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1179c5d5-fcf4-4732-959f-852ba78d2829-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.070204 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-config-data\") pod \"cinder-api-0\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " pod="openstack/cinder-api-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.070241 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.070345 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-scripts\") pod \"cinder-scheduler-0\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.071699 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1179c5d5-fcf4-4732-959f-852ba78d2829-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.076256 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.076289 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.076673 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-scripts\") pod \"cinder-scheduler-0\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.078424 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-config-data\") pod \"cinder-scheduler-0\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.093464 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpwt2\" (UniqueName: \"kubernetes.io/projected/1179c5d5-fcf4-4732-959f-852ba78d2829-kube-api-access-kpwt2\") pod \"cinder-scheduler-0\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 
13:13:45.098895 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.172036 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb913847-e636-4756-9de2-f5c7b45c2c9f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " pod="openstack/cinder-api-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.172098 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " pod="openstack/cinder-api-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.172137 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-config-data\") pod \"cinder-api-0\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " pod="openstack/cinder-api-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.172198 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-scripts\") pod \"cinder-api-0\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " pod="openstack/cinder-api-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.172234 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcp47\" (UniqueName: \"kubernetes.io/projected/bb913847-e636-4756-9de2-f5c7b45c2c9f-kube-api-access-dcp47\") pod \"cinder-api-0\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " pod="openstack/cinder-api-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.172255 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-config-data-custom\") pod \"cinder-api-0\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " pod="openstack/cinder-api-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.172269 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb913847-e636-4756-9de2-f5c7b45c2c9f-logs\") pod \"cinder-api-0\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " pod="openstack/cinder-api-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.172651 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb913847-e636-4756-9de2-f5c7b45c2c9f-logs\") pod \"cinder-api-0\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " pod="openstack/cinder-api-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.172702 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb913847-e636-4756-9de2-f5c7b45c2c9f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " pod="openstack/cinder-api-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.179052 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-config-data-custom\") pod 
\"cinder-api-0\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " pod="openstack/cinder-api-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.183595 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-config-data\") pod \"cinder-api-0\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " pod="openstack/cinder-api-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.184860 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-scripts\") pod \"cinder-api-0\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " pod="openstack/cinder-api-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.201801 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " pod="openstack/cinder-api-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.210836 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcp47\" (UniqueName: \"kubernetes.io/projected/bb913847-e636-4756-9de2-f5c7b45c2c9f-kube-api-access-dcp47\") pod \"cinder-api-0\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " pod="openstack/cinder-api-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.346078 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.364048 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.425944 4851 generic.go:334] "Generic (PLEG): container finished" podID="6f40427b-938d-4f47-a126-b05108b109a6" containerID="4cb73528187ef34285135d104b2ba7bbcfb39a5caac97fa3c33ea0f0ba1fcbfe" exitCode=0 Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.426022 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f9dc8947d-h5tw8" event={"ID":"6f40427b-938d-4f47-a126-b05108b109a6","Type":"ContainerDied","Data":"4cb73528187ef34285135d104b2ba7bbcfb39a5caac97fa3c33ea0f0ba1fcbfe"} Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.426065 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f9dc8947d-h5tw8" event={"ID":"6f40427b-938d-4f47-a126-b05108b109a6","Type":"ContainerDied","Data":"2287b7f9f413969d7ea20726208c432678c43877c69e20649bcd6bbbcc5f4341"} Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.426088 4851 scope.go:117] "RemoveContainer" containerID="4cb73528187ef34285135d104b2ba7bbcfb39a5caac97fa3c33ea0f0ba1fcbfe" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.426184 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-f9dc8947d-h5tw8" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.462626 4851 scope.go:117] "RemoveContainer" containerID="1ac11f9ec49b84b6c3f28d475f746db4561b66e46bc0916c55d245282b87baba" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.477447 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg8q7\" (UniqueName: \"kubernetes.io/projected/6f40427b-938d-4f47-a126-b05108b109a6-kube-api-access-kg8q7\") pod \"6f40427b-938d-4f47-a126-b05108b109a6\" (UID: \"6f40427b-938d-4f47-a126-b05108b109a6\") " Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.477488 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f40427b-938d-4f47-a126-b05108b109a6-logs\") pod \"6f40427b-938d-4f47-a126-b05108b109a6\" (UID: \"6f40427b-938d-4f47-a126-b05108b109a6\") " Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.477632 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f40427b-938d-4f47-a126-b05108b109a6-config-data\") pod \"6f40427b-938d-4f47-a126-b05108b109a6\" (UID: \"6f40427b-938d-4f47-a126-b05108b109a6\") " Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.477647 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f40427b-938d-4f47-a126-b05108b109a6-combined-ca-bundle\") pod \"6f40427b-938d-4f47-a126-b05108b109a6\" (UID: \"6f40427b-938d-4f47-a126-b05108b109a6\") " Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.477680 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f40427b-938d-4f47-a126-b05108b109a6-config-data-custom\") pod \"6f40427b-938d-4f47-a126-b05108b109a6\" (UID: \"6f40427b-938d-4f47-a126-b05108b109a6\") " Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.479219 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f40427b-938d-4f47-a126-b05108b109a6-logs" (OuterVolumeSpecName: "logs") pod "6f40427b-938d-4f47-a126-b05108b109a6" (UID: "6f40427b-938d-4f47-a126-b05108b109a6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.483847 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f40427b-938d-4f47-a126-b05108b109a6-kube-api-access-kg8q7" (OuterVolumeSpecName: "kube-api-access-kg8q7") pod "6f40427b-938d-4f47-a126-b05108b109a6" (UID: "6f40427b-938d-4f47-a126-b05108b109a6"). InnerVolumeSpecName "kube-api-access-kg8q7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.486578 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f40427b-938d-4f47-a126-b05108b109a6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6f40427b-938d-4f47-a126-b05108b109a6" (UID: "6f40427b-938d-4f47-a126-b05108b109a6"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.522544 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f40427b-938d-4f47-a126-b05108b109a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f40427b-938d-4f47-a126-b05108b109a6" (UID: "6f40427b-938d-4f47-a126-b05108b109a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.526462 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.530915 4851 scope.go:117] "RemoveContainer" containerID="4cb73528187ef34285135d104b2ba7bbcfb39a5caac97fa3c33ea0f0ba1fcbfe" Oct 01 13:13:45 crc kubenswrapper[4851]: E1001 13:13:45.539594 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cb73528187ef34285135d104b2ba7bbcfb39a5caac97fa3c33ea0f0ba1fcbfe\": container with ID starting with 4cb73528187ef34285135d104b2ba7bbcfb39a5caac97fa3c33ea0f0ba1fcbfe not found: ID does not exist" containerID="4cb73528187ef34285135d104b2ba7bbcfb39a5caac97fa3c33ea0f0ba1fcbfe" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.539669 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cb73528187ef34285135d104b2ba7bbcfb39a5caac97fa3c33ea0f0ba1fcbfe"} err="failed to get container status \"4cb73528187ef34285135d104b2ba7bbcfb39a5caac97fa3c33ea0f0ba1fcbfe\": rpc error: code = NotFound desc = could not find container \"4cb73528187ef34285135d104b2ba7bbcfb39a5caac97fa3c33ea0f0ba1fcbfe\": container with ID starting with 4cb73528187ef34285135d104b2ba7bbcfb39a5caac97fa3c33ea0f0ba1fcbfe not found: ID does not exist" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.539706 4851 scope.go:117] "RemoveContainer" containerID="1ac11f9ec49b84b6c3f28d475f746db4561b66e46bc0916c55d245282b87baba" Oct 01 13:13:45 crc kubenswrapper[4851]: E1001 13:13:45.544661 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ac11f9ec49b84b6c3f28d475f746db4561b66e46bc0916c55d245282b87baba\": container with ID starting with 1ac11f9ec49b84b6c3f28d475f746db4561b66e46bc0916c55d245282b87baba not found: ID does not exist" containerID="1ac11f9ec49b84b6c3f28d475f746db4561b66e46bc0916c55d245282b87baba" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.544714 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ac11f9ec49b84b6c3f28d475f746db4561b66e46bc0916c55d245282b87baba"} err="failed to get container status \"1ac11f9ec49b84b6c3f28d475f746db4561b66e46bc0916c55d245282b87baba\": rpc error: code = NotFound desc = could not find container \"1ac11f9ec49b84b6c3f28d475f746db4561b66e46bc0916c55d245282b87baba\": container with ID starting with 1ac11f9ec49b84b6c3f28d475f746db4561b66e46bc0916c55d245282b87baba not found: ID does not exist" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.569736 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f40427b-938d-4f47-a126-b05108b109a6-config-data" (OuterVolumeSpecName: "config-data") pod "6f40427b-938d-4f47-a126-b05108b109a6" (UID: "6f40427b-938d-4f47-a126-b05108b109a6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.579798 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg8q7\" (UniqueName: \"kubernetes.io/projected/6f40427b-938d-4f47-a126-b05108b109a6-kube-api-access-kg8q7\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.579834 4851 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f40427b-938d-4f47-a126-b05108b109a6-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.579845 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f40427b-938d-4f47-a126-b05108b109a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.579854 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f40427b-938d-4f47-a126-b05108b109a6-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.579862 4851 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f40427b-938d-4f47-a126-b05108b109a6-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.800930 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d994585bc-s9wp6"] Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.806868 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 13:13:45 crc kubenswrapper[4851]: W1001 13:13:45.813219 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1baf8f0_b24f_4dfd_a6fb_3e4e578c11f4.slice/crio-b1387381ff0097225ac92379dd0395df3c5847e598dd11e75ec42f32f6066f51 WatchSource:0}: Error finding container b1387381ff0097225ac92379dd0395df3c5847e598dd11e75ec42f32f6066f51: Status 404 returned error can't find the container with id b1387381ff0097225ac92379dd0395df3c5847e598dd11e75ec42f32f6066f51 Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.852627 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-f9dc8947d-h5tw8"] Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.862477 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-f9dc8947d-h5tw8"] Oct 01 13:13:45 crc kubenswrapper[4851]: I1001 13:13:45.868055 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67b666754b-b52ns" podUID="1db63449-71cb-4fa2-86db-43a83a914643" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.160:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.160:8443: connect: connection refused" Oct 01 13:13:46 crc kubenswrapper[4851]: I1001 13:13:46.020126 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 13:13:46 crc kubenswrapper[4851]: I1001 13:13:46.355642 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55afd90b-7461-4db7-89a6-d45f9bdcb1b3" path="/var/lib/kubelet/pods/55afd90b-7461-4db7-89a6-d45f9bdcb1b3/volumes" Oct 01 13:13:46 crc kubenswrapper[4851]: I1001 13:13:46.357008 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f40427b-938d-4f47-a126-b05108b109a6" 
path="/var/lib/kubelet/pods/6f40427b-938d-4f47-a126-b05108b109a6/volumes" Oct 01 13:13:46 crc kubenswrapper[4851]: I1001 13:13:46.450531 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1179c5d5-fcf4-4732-959f-852ba78d2829","Type":"ContainerStarted","Data":"6575908806aff04738d95c27ec0663f60f81f0342cabc66d6f8b88d3aa205151"} Oct 01 13:13:46 crc kubenswrapper[4851]: I1001 13:13:46.456709 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bb913847-e636-4756-9de2-f5c7b45c2c9f","Type":"ContainerStarted","Data":"2ed1d82d62cf782063677c0de48e1bf590cfcab8a24bca312878bac47fd2037e"} Oct 01 13:13:46 crc kubenswrapper[4851]: I1001 13:13:46.468042 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"973a48f4-ab11-4c1d-b358-36c986b44f8c","Type":"ContainerStarted","Data":"9c11cab70fb348044092c4a0b61186fefb99429ff5a33d4a419303a0d7ad7432"} Oct 01 13:13:46 crc kubenswrapper[4851]: I1001 13:13:46.468081 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"973a48f4-ab11-4c1d-b358-36c986b44f8c","Type":"ContainerStarted","Data":"7547d91f8a340e368c10f0d6466101cc3c9b26345a464fe5c003cd886e0b60cf"} Oct 01 13:13:46 crc kubenswrapper[4851]: I1001 13:13:46.470599 4851 generic.go:334] "Generic (PLEG): container finished" podID="f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4" containerID="730a3d01e83d70037e0d1e6eec7d2255a45f99e3144d55b4d3bc2d7366216b4f" exitCode=0 Oct 01 13:13:46 crc kubenswrapper[4851]: I1001 13:13:46.470643 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d994585bc-s9wp6" event={"ID":"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4","Type":"ContainerDied","Data":"730a3d01e83d70037e0d1e6eec7d2255a45f99e3144d55b4d3bc2d7366216b4f"} Oct 01 13:13:46 crc kubenswrapper[4851]: I1001 13:13:46.470662 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d994585bc-s9wp6" event={"ID":"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4","Type":"ContainerStarted","Data":"b1387381ff0097225ac92379dd0395df3c5847e598dd11e75ec42f32f6066f51"} Oct 01 13:13:46 crc kubenswrapper[4851]: I1001 13:13:46.921488 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 01 13:13:47 crc kubenswrapper[4851]: I1001 13:13:47.503829 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"973a48f4-ab11-4c1d-b358-36c986b44f8c","Type":"ContainerStarted","Data":"454cdbc95cfa935e30954f494ccea16c320b736804108fbf89fe7f53ac0430ad"} Oct 01 13:13:47 crc kubenswrapper[4851]: I1001 13:13:47.504111 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"973a48f4-ab11-4c1d-b358-36c986b44f8c","Type":"ContainerStarted","Data":"31cb5f3156b36394319f6015cd6472af26f2f82f94a3c2720b9a8746f95387dc"} Oct 01 13:13:47 crc kubenswrapper[4851]: I1001 13:13:47.508202 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d994585bc-s9wp6" event={"ID":"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4","Type":"ContainerStarted","Data":"7de8ca1812be5087708ef2bafde78ee8e14f485f6331c0772023e1d7df6836c4"} Oct 01 13:13:47 crc kubenswrapper[4851]: I1001 13:13:47.509530 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:47 crc kubenswrapper[4851]: I1001 13:13:47.515517 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"1179c5d5-fcf4-4732-959f-852ba78d2829","Type":"ContainerStarted","Data":"8b3ed9c559569c38c9c5c3d75695afc9377fbc41354cada12bfd6710412aac8b"} Oct 01 13:13:47 crc kubenswrapper[4851]: I1001 13:13:47.520966 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bb913847-e636-4756-9de2-f5c7b45c2c9f","Type":"ContainerStarted","Data":"a8bd66a0f7cde548c3f72f01cde8f0f0a77731baefd519dd0f97b4cfb649cbdb"} Oct 01 13:13:47 crc kubenswrapper[4851]: I1001 13:13:47.532016 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d994585bc-s9wp6" podStartSLOduration=3.53198048 podStartE2EDuration="3.53198048s" podCreationTimestamp="2025-10-01 13:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:13:47.527271185 +0000 UTC m=+1235.872388671" watchObservedRunningTime="2025-10-01 13:13:47.53198048 +0000 UTC m=+1235.877097966" Oct 01 13:13:48 crc kubenswrapper[4851]: I1001 13:13:48.530079 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1179c5d5-fcf4-4732-959f-852ba78d2829","Type":"ContainerStarted","Data":"0e6c499764f6c34718bc6b514bd73d540b331e4dcd4d95de4bcd842bbee792d2"} Oct 01 13:13:48 crc kubenswrapper[4851]: I1001 13:13:48.534080 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bb913847-e636-4756-9de2-f5c7b45c2c9f","Type":"ContainerStarted","Data":"2c75d2374fc539dfbef6896dce3f962ae7ddfc45a710add2542ec36c10c087b1"} Oct 01 13:13:48 crc kubenswrapper[4851]: I1001 13:13:48.534245 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="bb913847-e636-4756-9de2-f5c7b45c2c9f" containerName="cinder-api" containerID="cri-o://2c75d2374fc539dfbef6896dce3f962ae7ddfc45a710add2542ec36c10c087b1" gracePeriod=30 Oct 01 13:13:48 crc kubenswrapper[4851]: I1001 13:13:48.534161 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="bb913847-e636-4756-9de2-f5c7b45c2c9f" containerName="cinder-api-log" containerID="cri-o://a8bd66a0f7cde548c3f72f01cde8f0f0a77731baefd519dd0f97b4cfb649cbdb" gracePeriod=30 Oct 01 13:13:48 crc kubenswrapper[4851]: I1001 13:13:48.534396 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 01 13:13:48 crc kubenswrapper[4851]: I1001 13:13:48.558307 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.301828295 podStartE2EDuration="4.558291186s" podCreationTimestamp="2025-10-01 13:13:44 +0000 UTC" firstStartedPulling="2025-10-01 13:13:45.827665629 +0000 UTC m=+1234.172783115" lastFinishedPulling="2025-10-01 13:13:46.08412851 +0000 UTC m=+1234.429246006" observedRunningTime="2025-10-01 13:13:48.550791542 +0000 UTC m=+1236.895909098" watchObservedRunningTime="2025-10-01 13:13:48.558291186 +0000 UTC m=+1236.903408672" Oct 01 13:13:48 crc kubenswrapper[4851]: I1001 13:13:48.578628 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.578611556 podStartE2EDuration="4.578611556s" podCreationTimestamp="2025-10-01 13:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:13:48.576786144 +0000 UTC m=+1236.921903640" 
watchObservedRunningTime="2025-10-01 13:13:48.578611556 +0000 UTC m=+1236.923729032" Oct 01 13:13:48 crc kubenswrapper[4851]: I1001 13:13:48.896298 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Oct 01 13:13:48 crc kubenswrapper[4851]: I1001 13:13:48.914837 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.079975 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.146353 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb913847-e636-4756-9de2-f5c7b45c2c9f-logs\") pod \"bb913847-e636-4756-9de2-f5c7b45c2c9f\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.146629 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-config-data-custom\") pod \"bb913847-e636-4756-9de2-f5c7b45c2c9f\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.146710 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-config-data\") pod \"bb913847-e636-4756-9de2-f5c7b45c2c9f\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.146885 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb913847-e636-4756-9de2-f5c7b45c2c9f-logs" (OuterVolumeSpecName: "logs") pod "bb913847-e636-4756-9de2-f5c7b45c2c9f" (UID: "bb913847-e636-4756-9de2-f5c7b45c2c9f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.147465 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-scripts\") pod \"bb913847-e636-4756-9de2-f5c7b45c2c9f\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.147597 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-combined-ca-bundle\") pod \"bb913847-e636-4756-9de2-f5c7b45c2c9f\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.147780 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcp47\" (UniqueName: \"kubernetes.io/projected/bb913847-e636-4756-9de2-f5c7b45c2c9f-kube-api-access-dcp47\") pod \"bb913847-e636-4756-9de2-f5c7b45c2c9f\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.147871 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb913847-e636-4756-9de2-f5c7b45c2c9f-etc-machine-id\") pod \"bb913847-e636-4756-9de2-f5c7b45c2c9f\" (UID: \"bb913847-e636-4756-9de2-f5c7b45c2c9f\") " Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.148085 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bb913847-e636-4756-9de2-f5c7b45c2c9f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bb913847-e636-4756-9de2-f5c7b45c2c9f" (UID: "bb913847-e636-4756-9de2-f5c7b45c2c9f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.148535 4851 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb913847-e636-4756-9de2-f5c7b45c2c9f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.148578 4851 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb913847-e636-4756-9de2-f5c7b45c2c9f-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.153630 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb913847-e636-4756-9de2-f5c7b45c2c9f-kube-api-access-dcp47" (OuterVolumeSpecName: "kube-api-access-dcp47") pod "bb913847-e636-4756-9de2-f5c7b45c2c9f" (UID: "bb913847-e636-4756-9de2-f5c7b45c2c9f"). InnerVolumeSpecName "kube-api-access-dcp47". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.153688 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bb913847-e636-4756-9de2-f5c7b45c2c9f" (UID: "bb913847-e636-4756-9de2-f5c7b45c2c9f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.160366 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-scripts" (OuterVolumeSpecName: "scripts") pod "bb913847-e636-4756-9de2-f5c7b45c2c9f" (UID: "bb913847-e636-4756-9de2-f5c7b45c2c9f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.176707 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb913847-e636-4756-9de2-f5c7b45c2c9f" (UID: "bb913847-e636-4756-9de2-f5c7b45c2c9f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.204513 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-config-data" (OuterVolumeSpecName: "config-data") pod "bb913847-e636-4756-9de2-f5c7b45c2c9f" (UID: "bb913847-e636-4756-9de2-f5c7b45c2c9f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.251593 4851 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.251618 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.251627 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.251636 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb913847-e636-4756-9de2-f5c7b45c2c9f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.251646 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcp47\" (UniqueName: \"kubernetes.io/projected/bb913847-e636-4756-9de2-f5c7b45c2c9f-kube-api-access-dcp47\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.545926 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"973a48f4-ab11-4c1d-b358-36c986b44f8c","Type":"ContainerStarted","Data":"1ed97d7f4b48d3b9b7a83d8c46053ac262a4889712ef898ef29dd2503552c0bb"} Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.546214 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.549413 4851 generic.go:334] "Generic (PLEG): container finished" podID="bb913847-e636-4756-9de2-f5c7b45c2c9f" containerID="2c75d2374fc539dfbef6896dce3f962ae7ddfc45a710add2542ec36c10c087b1" exitCode=0 Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.549440 4851 generic.go:334] "Generic (PLEG): container finished" 
podID="bb913847-e636-4756-9de2-f5c7b45c2c9f" containerID="a8bd66a0f7cde548c3f72f01cde8f0f0a77731baefd519dd0f97b4cfb649cbdb" exitCode=143 Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.550049 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.556794 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bb913847-e636-4756-9de2-f5c7b45c2c9f","Type":"ContainerDied","Data":"2c75d2374fc539dfbef6896dce3f962ae7ddfc45a710add2542ec36c10c087b1"} Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.556842 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bb913847-e636-4756-9de2-f5c7b45c2c9f","Type":"ContainerDied","Data":"a8bd66a0f7cde548c3f72f01cde8f0f0a77731baefd519dd0f97b4cfb649cbdb"} Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.556854 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bb913847-e636-4756-9de2-f5c7b45c2c9f","Type":"ContainerDied","Data":"2ed1d82d62cf782063677c0de48e1bf590cfcab8a24bca312878bac47fd2037e"} Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.556873 4851 scope.go:117] "RemoveContainer" containerID="2c75d2374fc539dfbef6896dce3f962ae7ddfc45a710add2542ec36c10c087b1" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.565776 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.581162 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.958609056 podStartE2EDuration="5.581146563s" podCreationTimestamp="2025-10-01 13:13:44 +0000 UTC" firstStartedPulling="2025-10-01 13:13:45.536349363 +0000 UTC m=+1233.881466849" lastFinishedPulling="2025-10-01 13:13:49.15888688 +0000 UTC m=+1237.504004356" observedRunningTime="2025-10-01 13:13:49.579403553 +0000 UTC m=+1237.924521049" watchObservedRunningTime="2025-10-01 13:13:49.581146563 +0000 UTC m=+1237.926264049" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.596124 4851 scope.go:117] "RemoveContainer" containerID="a8bd66a0f7cde548c3f72f01cde8f0f0a77731baefd519dd0f97b4cfb649cbdb" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.625395 4851 scope.go:117] "RemoveContainer" containerID="2c75d2374fc539dfbef6896dce3f962ae7ddfc45a710add2542ec36c10c087b1" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.633517 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.641261 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 01 13:13:49 crc kubenswrapper[4851]: E1001 13:13:49.642133 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c75d2374fc539dfbef6896dce3f962ae7ddfc45a710add2542ec36c10c087b1\": container with ID starting with 2c75d2374fc539dfbef6896dce3f962ae7ddfc45a710add2542ec36c10c087b1 not found: ID does not exist" containerID="2c75d2374fc539dfbef6896dce3f962ae7ddfc45a710add2542ec36c10c087b1" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.642221 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c75d2374fc539dfbef6896dce3f962ae7ddfc45a710add2542ec36c10c087b1"} err="failed to get container status 
\"2c75d2374fc539dfbef6896dce3f962ae7ddfc45a710add2542ec36c10c087b1\": rpc error: code = NotFound desc = could not find container \"2c75d2374fc539dfbef6896dce3f962ae7ddfc45a710add2542ec36c10c087b1\": container with ID starting with 2c75d2374fc539dfbef6896dce3f962ae7ddfc45a710add2542ec36c10c087b1 not found: ID does not exist" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.642276 4851 scope.go:117] "RemoveContainer" containerID="a8bd66a0f7cde548c3f72f01cde8f0f0a77731baefd519dd0f97b4cfb649cbdb" Oct 01 13:13:49 crc kubenswrapper[4851]: E1001 13:13:49.655957 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8bd66a0f7cde548c3f72f01cde8f0f0a77731baefd519dd0f97b4cfb649cbdb\": container with ID starting with a8bd66a0f7cde548c3f72f01cde8f0f0a77731baefd519dd0f97b4cfb649cbdb not found: ID does not exist" containerID="a8bd66a0f7cde548c3f72f01cde8f0f0a77731baefd519dd0f97b4cfb649cbdb" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.656001 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8bd66a0f7cde548c3f72f01cde8f0f0a77731baefd519dd0f97b4cfb649cbdb"} err="failed to get container status \"a8bd66a0f7cde548c3f72f01cde8f0f0a77731baefd519dd0f97b4cfb649cbdb\": rpc error: code = NotFound desc = could not find container \"a8bd66a0f7cde548c3f72f01cde8f0f0a77731baefd519dd0f97b4cfb649cbdb\": container with ID starting with a8bd66a0f7cde548c3f72f01cde8f0f0a77731baefd519dd0f97b4cfb649cbdb not found: ID does not exist" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.656031 4851 scope.go:117] "RemoveContainer" containerID="2c75d2374fc539dfbef6896dce3f962ae7ddfc45a710add2542ec36c10c087b1" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.657963 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c75d2374fc539dfbef6896dce3f962ae7ddfc45a710add2542ec36c10c087b1"} err="failed to get container status \"2c75d2374fc539dfbef6896dce3f962ae7ddfc45a710add2542ec36c10c087b1\": rpc error: code = NotFound desc = could not find container \"2c75d2374fc539dfbef6896dce3f962ae7ddfc45a710add2542ec36c10c087b1\": container with ID starting with 2c75d2374fc539dfbef6896dce3f962ae7ddfc45a710add2542ec36c10c087b1 not found: ID does not exist" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.658016 4851 scope.go:117] "RemoveContainer" containerID="a8bd66a0f7cde548c3f72f01cde8f0f0a77731baefd519dd0f97b4cfb649cbdb" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.658329 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8bd66a0f7cde548c3f72f01cde8f0f0a77731baefd519dd0f97b4cfb649cbdb"} err="failed to get container status \"a8bd66a0f7cde548c3f72f01cde8f0f0a77731baefd519dd0f97b4cfb649cbdb\": rpc error: code = NotFound desc = could not find container \"a8bd66a0f7cde548c3f72f01cde8f0f0a77731baefd519dd0f97b4cfb649cbdb\": container with ID starting with a8bd66a0f7cde548c3f72f01cde8f0f0a77731baefd519dd0f97b4cfb649cbdb not found: ID does not exist" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.663657 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 01 13:13:49 crc kubenswrapper[4851]: E1001 13:13:49.664245 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f40427b-938d-4f47-a126-b05108b109a6" containerName="barbican-api" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.664263 4851 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="6f40427b-938d-4f47-a126-b05108b109a6" containerName="barbican-api" Oct 01 13:13:49 crc kubenswrapper[4851]: E1001 13:13:49.664276 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb913847-e636-4756-9de2-f5c7b45c2c9f" containerName="cinder-api-log" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.664284 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb913847-e636-4756-9de2-f5c7b45c2c9f" containerName="cinder-api-log" Oct 01 13:13:49 crc kubenswrapper[4851]: E1001 13:13:49.664305 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb913847-e636-4756-9de2-f5c7b45c2c9f" containerName="cinder-api" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.664315 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb913847-e636-4756-9de2-f5c7b45c2c9f" containerName="cinder-api" Oct 01 13:13:49 crc kubenswrapper[4851]: E1001 13:13:49.664341 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f40427b-938d-4f47-a126-b05108b109a6" containerName="barbican-api-log" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.664350 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f40427b-938d-4f47-a126-b05108b109a6" containerName="barbican-api-log" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.664600 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f40427b-938d-4f47-a126-b05108b109a6" containerName="barbican-api" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.664620 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb913847-e636-4756-9de2-f5c7b45c2c9f" containerName="cinder-api-log" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.664633 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f40427b-938d-4f47-a126-b05108b109a6" containerName="barbican-api-log" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.664666 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb913847-e636-4756-9de2-f5c7b45c2c9f" containerName="cinder-api" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.666068 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.674098 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.675323 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.675364 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.681150 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.760819 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-config-data-custom\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.760877 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwhb6\" (UniqueName: \"kubernetes.io/projected/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-kube-api-access-nwhb6\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.761003 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.761123 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-logs\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.761270 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.761294 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.761322 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-config-data\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.761340 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.761371 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-scripts\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.872744 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.872808 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-logs\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.872865 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.872882 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.872906 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-config-data\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.872926 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.872952 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-scripts\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.872998 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-config-data-custom\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.873022 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwhb6\" (UniqueName: 
\"kubernetes.io/projected/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-kube-api-access-nwhb6\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.873604 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.873935 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-logs\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.880966 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.885512 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.886298 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.889589 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-scripts\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.891367 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-config-data-custom\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.894981 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-config-data\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:49 crc kubenswrapper[4851]: I1001 13:13:49.898299 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwhb6\" (UniqueName: \"kubernetes.io/projected/698a82d3-c7a7-4b4b-8cf9-46f6589744d9-kube-api-access-nwhb6\") pod \"cinder-api-0\" (UID: \"698a82d3-c7a7-4b4b-8cf9-46f6589744d9\") " pod="openstack/cinder-api-0" Oct 01 13:13:50 crc kubenswrapper[4851]: I1001 13:13:50.038131 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 01 13:13:50 crc kubenswrapper[4851]: I1001 13:13:50.099878 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 01 13:13:50 crc kubenswrapper[4851]: I1001 13:13:50.328533 4851 scope.go:117] "RemoveContainer" containerID="e24b7de9a6b59f816add29e99ed4c97b0171d8f00ee2b1591eb4620a52cff2dd" Oct 01 13:13:50 crc kubenswrapper[4851]: I1001 13:13:50.359402 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb913847-e636-4756-9de2-f5c7b45c2c9f" path="/var/lib/kubelet/pods/bb913847-e636-4756-9de2-f5c7b45c2c9f/volumes" Oct 01 13:13:50 crc kubenswrapper[4851]: I1001 13:13:50.536413 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 13:13:50 crc kubenswrapper[4851]: W1001 13:13:50.547018 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod698a82d3_c7a7_4b4b_8cf9_46f6589744d9.slice/crio-806ee56a5eb31a1573ab5479219ec5a79ceb9370c04b73a396164c11c0d3fbc2 WatchSource:0}: Error finding container 806ee56a5eb31a1573ab5479219ec5a79ceb9370c04b73a396164c11c0d3fbc2: Status 404 returned error can't find the container with id 806ee56a5eb31a1573ab5479219ec5a79ceb9370c04b73a396164c11c0d3fbc2 Oct 01 13:13:50 crc kubenswrapper[4851]: I1001 13:13:50.567869 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"698a82d3-c7a7-4b4b-8cf9-46f6589744d9","Type":"ContainerStarted","Data":"806ee56a5eb31a1573ab5479219ec5a79ceb9370c04b73a396164c11c0d3fbc2"} Oct 01 13:13:50 crc kubenswrapper[4851]: I1001 13:13:50.569931 4851 generic.go:334] "Generic (PLEG): container finished" podID="bfc29451-e27b-4bd0-9a5f-6a177e7621be" containerID="b34c5f5f25ad1f532fa333e66fa3f7f64c12594ef6c6d5ee4b481d76a2a5fbbf" exitCode=0 Oct 01 13:13:50 crc kubenswrapper[4851]: I1001 13:13:50.569966 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fvbs7" event={"ID":"bfc29451-e27b-4bd0-9a5f-6a177e7621be","Type":"ContainerDied","Data":"b34c5f5f25ad1f532fa333e66fa3f7f64c12594ef6c6d5ee4b481d76a2a5fbbf"} Oct 01 13:13:50 crc kubenswrapper[4851]: I1001 13:13:50.575155 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"971ca0ac-6de7-42f1-bf29-5174fd80ced4","Type":"ContainerStarted","Data":"c20a7b78f4412c0f21c222ebc0f47ef6d52387a25f64c9b0748dd8ea538d34b0"} Oct 01 13:13:51 crc kubenswrapper[4851]: I1001 13:13:51.598171 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"698a82d3-c7a7-4b4b-8cf9-46f6589744d9","Type":"ContainerStarted","Data":"4ee10cf5a4bd206a42cf9ccf3ef879a2814c45ecb4beef117b3e1051c0495380"} Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.047818 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fvbs7" Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.127724 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc29451-e27b-4bd0-9a5f-6a177e7621be-combined-ca-bundle\") pod \"bfc29451-e27b-4bd0-9a5f-6a177e7621be\" (UID: \"bfc29451-e27b-4bd0-9a5f-6a177e7621be\") " Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.127763 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bfc29451-e27b-4bd0-9a5f-6a177e7621be-config\") pod \"bfc29451-e27b-4bd0-9a5f-6a177e7621be\" (UID: \"bfc29451-e27b-4bd0-9a5f-6a177e7621be\") " Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.127832 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll4rb\" (UniqueName: \"kubernetes.io/projected/bfc29451-e27b-4bd0-9a5f-6a177e7621be-kube-api-access-ll4rb\") pod \"bfc29451-e27b-4bd0-9a5f-6a177e7621be\" (UID: \"bfc29451-e27b-4bd0-9a5f-6a177e7621be\") " Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.132129 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfc29451-e27b-4bd0-9a5f-6a177e7621be-kube-api-access-ll4rb" (OuterVolumeSpecName: "kube-api-access-ll4rb") pod "bfc29451-e27b-4bd0-9a5f-6a177e7621be" (UID: "bfc29451-e27b-4bd0-9a5f-6a177e7621be"). InnerVolumeSpecName "kube-api-access-ll4rb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.157358 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfc29451-e27b-4bd0-9a5f-6a177e7621be-config" (OuterVolumeSpecName: "config") pod "bfc29451-e27b-4bd0-9a5f-6a177e7621be" (UID: "bfc29451-e27b-4bd0-9a5f-6a177e7621be"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.174704 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfc29451-e27b-4bd0-9a5f-6a177e7621be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfc29451-e27b-4bd0-9a5f-6a177e7621be" (UID: "bfc29451-e27b-4bd0-9a5f-6a177e7621be"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.230341 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc29451-e27b-4bd0-9a5f-6a177e7621be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.230390 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bfc29451-e27b-4bd0-9a5f-6a177e7621be-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.230403 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll4rb\" (UniqueName: \"kubernetes.io/projected/bfc29451-e27b-4bd0-9a5f-6a177e7621be-kube-api-access-ll4rb\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.626874 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fvbs7" event={"ID":"bfc29451-e27b-4bd0-9a5f-6a177e7621be","Type":"ContainerDied","Data":"ba1a72275b809385d92871dc9d3c962a81158ed9e9aeb716e8a510bb35196b7f"} Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.627188 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba1a72275b809385d92871dc9d3c962a81158ed9e9aeb716e8a510bb35196b7f" Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.627247 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fvbs7" Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.638897 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"698a82d3-c7a7-4b4b-8cf9-46f6589744d9","Type":"ContainerStarted","Data":"6375ded5bf42939adf16c7aebd39076506b382349bd4cfbb663d82d33567ab19"} Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.639327 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.681545 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.681493953 podStartE2EDuration="3.681493953s" podCreationTimestamp="2025-10-01 13:13:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:13:52.656110408 +0000 UTC m=+1241.001227914" watchObservedRunningTime="2025-10-01 13:13:52.681493953 +0000 UTC m=+1241.026611449" Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.863200 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d994585bc-s9wp6"] Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.863417 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d994585bc-s9wp6" podUID="f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4" containerName="dnsmasq-dns" containerID="cri-o://7de8ca1812be5087708ef2bafde78ee8e14f485f6331c0772023e1d7df6836c4" gracePeriod=10 Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.864668 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.986820 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-788dbb8495-dp5gr"] Oct 01 13:13:52 crc kubenswrapper[4851]: E1001 13:13:52.987701 4851 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="bfc29451-e27b-4bd0-9a5f-6a177e7621be" containerName="neutron-db-sync" Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.987717 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfc29451-e27b-4bd0-9a5f-6a177e7621be" containerName="neutron-db-sync" Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.988058 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfc29451-e27b-4bd0-9a5f-6a177e7621be" containerName="neutron-db-sync" Oct 01 13:13:52 crc kubenswrapper[4851]: I1001 13:13:52.989732 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.029301 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-788dbb8495-dp5gr"] Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.042219 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6856b75994-rphb7"] Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.043831 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6856b75994-rphb7" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.046042 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-sq998" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.047180 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.047414 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.047650 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.051087 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-config\") pod \"dnsmasq-dns-788dbb8495-dp5gr\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") " pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.051140 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-ovsdbserver-nb\") pod \"dnsmasq-dns-788dbb8495-dp5gr\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") " pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.051183 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-ovsdbserver-sb\") pod \"dnsmasq-dns-788dbb8495-dp5gr\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") " pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.051216 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-dns-svc\") pod \"dnsmasq-dns-788dbb8495-dp5gr\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") " pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.051232 4851 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-dns-swift-storage-0\") pod \"dnsmasq-dns-788dbb8495-dp5gr\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") " pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.051283 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxh7v\" (UniqueName: \"kubernetes.io/projected/a82d9b2c-d132-4496-92f6-9fea17968b18-kube-api-access-mxh7v\") pod \"dnsmasq-dns-788dbb8495-dp5gr\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") " pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.052272 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6856b75994-rphb7"] Oct 01 13:13:53 crc kubenswrapper[4851]: E1001 13:13:53.093411 4851 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1baf8f0_b24f_4dfd_a6fb_3e4e578c11f4.slice/crio-7de8ca1812be5087708ef2bafde78ee8e14f485f6331c0772023e1d7df6836c4.scope\": RecentStats: unable to find data in memory cache]" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.154070 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-config\") pod \"dnsmasq-dns-788dbb8495-dp5gr\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") " pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.154288 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-ovsdbserver-nb\") pod \"dnsmasq-dns-788dbb8495-dp5gr\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") " pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.154337 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-ovsdbserver-sb\") pod \"dnsmasq-dns-788dbb8495-dp5gr\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") " pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.154394 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-dns-svc\") pod \"dnsmasq-dns-788dbb8495-dp5gr\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") " pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.154420 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-dns-swift-storage-0\") pod \"dnsmasq-dns-788dbb8495-dp5gr\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") " pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.154481 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-httpd-config\") pod \"neutron-6856b75994-rphb7\" (UID: 
\"ff9726c5-9ce7-444b-9dfd-9b014835c375\") " pod="openstack/neutron-6856b75994-rphb7" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.154531 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-ovndb-tls-certs\") pod \"neutron-6856b75994-rphb7\" (UID: \"ff9726c5-9ce7-444b-9dfd-9b014835c375\") " pod="openstack/neutron-6856b75994-rphb7" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.154558 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-combined-ca-bundle\") pod \"neutron-6856b75994-rphb7\" (UID: \"ff9726c5-9ce7-444b-9dfd-9b014835c375\") " pod="openstack/neutron-6856b75994-rphb7" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.154581 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-config\") pod \"neutron-6856b75994-rphb7\" (UID: \"ff9726c5-9ce7-444b-9dfd-9b014835c375\") " pod="openstack/neutron-6856b75994-rphb7" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.154600 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5skp\" (UniqueName: \"kubernetes.io/projected/ff9726c5-9ce7-444b-9dfd-9b014835c375-kube-api-access-t5skp\") pod \"neutron-6856b75994-rphb7\" (UID: \"ff9726c5-9ce7-444b-9dfd-9b014835c375\") " pod="openstack/neutron-6856b75994-rphb7" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.154618 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxh7v\" (UniqueName: \"kubernetes.io/projected/a82d9b2c-d132-4496-92f6-9fea17968b18-kube-api-access-mxh7v\") pod \"dnsmasq-dns-788dbb8495-dp5gr\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") " pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.155203 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-config\") pod \"dnsmasq-dns-788dbb8495-dp5gr\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") " pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.155947 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-ovsdbserver-nb\") pod \"dnsmasq-dns-788dbb8495-dp5gr\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") " pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.156579 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-dns-svc\") pod \"dnsmasq-dns-788dbb8495-dp5gr\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") " pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.156741 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-ovsdbserver-sb\") pod \"dnsmasq-dns-788dbb8495-dp5gr\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") " 
pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.164758 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-dns-swift-storage-0\") pod \"dnsmasq-dns-788dbb8495-dp5gr\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") " pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.192013 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxh7v\" (UniqueName: \"kubernetes.io/projected/a82d9b2c-d132-4496-92f6-9fea17968b18-kube-api-access-mxh7v\") pod \"dnsmasq-dns-788dbb8495-dp5gr\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") " pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.256156 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-httpd-config\") pod \"neutron-6856b75994-rphb7\" (UID: \"ff9726c5-9ce7-444b-9dfd-9b014835c375\") " pod="openstack/neutron-6856b75994-rphb7" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.256191 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-ovndb-tls-certs\") pod \"neutron-6856b75994-rphb7\" (UID: \"ff9726c5-9ce7-444b-9dfd-9b014835c375\") " pod="openstack/neutron-6856b75994-rphb7" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.256214 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-combined-ca-bundle\") pod \"neutron-6856b75994-rphb7\" (UID: \"ff9726c5-9ce7-444b-9dfd-9b014835c375\") " pod="openstack/neutron-6856b75994-rphb7" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.256236 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-config\") pod \"neutron-6856b75994-rphb7\" (UID: \"ff9726c5-9ce7-444b-9dfd-9b014835c375\") " pod="openstack/neutron-6856b75994-rphb7" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.256258 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5skp\" (UniqueName: \"kubernetes.io/projected/ff9726c5-9ce7-444b-9dfd-9b014835c375-kube-api-access-t5skp\") pod \"neutron-6856b75994-rphb7\" (UID: \"ff9726c5-9ce7-444b-9dfd-9b014835c375\") " pod="openstack/neutron-6856b75994-rphb7" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.271251 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-ovndb-tls-certs\") pod \"neutron-6856b75994-rphb7\" (UID: \"ff9726c5-9ce7-444b-9dfd-9b014835c375\") " pod="openstack/neutron-6856b75994-rphb7" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.271612 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-httpd-config\") pod \"neutron-6856b75994-rphb7\" (UID: \"ff9726c5-9ce7-444b-9dfd-9b014835c375\") " pod="openstack/neutron-6856b75994-rphb7" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.271786 4851 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-config\") pod \"neutron-6856b75994-rphb7\" (UID: \"ff9726c5-9ce7-444b-9dfd-9b014835c375\") " pod="openstack/neutron-6856b75994-rphb7" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.301669 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-combined-ca-bundle\") pod \"neutron-6856b75994-rphb7\" (UID: \"ff9726c5-9ce7-444b-9dfd-9b014835c375\") " pod="openstack/neutron-6856b75994-rphb7" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.305202 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5skp\" (UniqueName: \"kubernetes.io/projected/ff9726c5-9ce7-444b-9dfd-9b014835c375-kube-api-access-t5skp\") pod \"neutron-6856b75994-rphb7\" (UID: \"ff9726c5-9ce7-444b-9dfd-9b014835c375\") " pod="openstack/neutron-6856b75994-rphb7" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.403737 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.425713 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6856b75994-rphb7" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.464093 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.564160 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-config\") pod \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.564305 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-dns-swift-storage-0\") pod \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.564371 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-ovsdbserver-nb\") pod \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.564470 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-ovsdbserver-sb\") pod \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.564603 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx2zx\" (UniqueName: \"kubernetes.io/projected/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-kube-api-access-sx2zx\") pod \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.564658 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-dns-svc\") pod \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.588707 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-kube-api-access-sx2zx" (OuterVolumeSpecName: "kube-api-access-sx2zx") pod "f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4" (UID: "f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4"). InnerVolumeSpecName "kube-api-access-sx2zx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.638670 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4" (UID: "f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.645913 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-78d78d4545-tv25n" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.646300 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-config" (OuterVolumeSpecName: "config") pod "f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4" (UID: "f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.657158 4851 generic.go:334] "Generic (PLEG): container finished" podID="f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4" containerID="7de8ca1812be5087708ef2bafde78ee8e14f485f6331c0772023e1d7df6836c4" exitCode=0 Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.657295 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d994585bc-s9wp6" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.658231 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d994585bc-s9wp6" event={"ID":"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4","Type":"ContainerDied","Data":"7de8ca1812be5087708ef2bafde78ee8e14f485f6331c0772023e1d7df6836c4"} Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.658533 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d994585bc-s9wp6" event={"ID":"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4","Type":"ContainerDied","Data":"b1387381ff0097225ac92379dd0395df3c5847e598dd11e75ec42f32f6066f51"} Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.658562 4851 scope.go:117] "RemoveContainer" containerID="7de8ca1812be5087708ef2bafde78ee8e14f485f6331c0772023e1d7df6836c4" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.664220 4851 generic.go:334] "Generic (PLEG): container finished" podID="971ca0ac-6de7-42f1-bf29-5174fd80ced4" containerID="c20a7b78f4412c0f21c222ebc0f47ef6d52387a25f64c9b0748dd8ea538d34b0" exitCode=1 Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.664883 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"971ca0ac-6de7-42f1-bf29-5174fd80ced4","Type":"ContainerDied","Data":"c20a7b78f4412c0f21c222ebc0f47ef6d52387a25f64c9b0748dd8ea538d34b0"} Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.665749 4851 scope.go:117] "RemoveContainer" containerID="c20a7b78f4412c0f21c222ebc0f47ef6d52387a25f64c9b0748dd8ea538d34b0" Oct 01 13:13:53 crc kubenswrapper[4851]: E1001 13:13:53.666107 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(971ca0ac-6de7-42f1-bf29-5174fd80ced4)\"" pod="openstack/watcher-decision-engine-0" podUID="971ca0ac-6de7-42f1-bf29-5174fd80ced4" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.671481 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4" (UID: "f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.674741 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-dns-svc\") pod \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\" (UID: \"f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4\") " Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.675431 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.675450 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx2zx\" (UniqueName: \"kubernetes.io/projected/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-kube-api-access-sx2zx\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.677390 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:53 crc kubenswrapper[4851]: W1001 13:13:53.677711 4851 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4/volumes/kubernetes.io~configmap/dns-svc Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.677730 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4" (UID: "f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.688197 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4" (UID: "f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.701591 4851 scope.go:117] "RemoveContainer" containerID="730a3d01e83d70037e0d1e6eec7d2255a45f99e3144d55b4d3bc2d7366216b4f" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.719097 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4" (UID: "f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.731150 4851 scope.go:117] "RemoveContainer" containerID="7de8ca1812be5087708ef2bafde78ee8e14f485f6331c0772023e1d7df6836c4" Oct 01 13:13:53 crc kubenswrapper[4851]: E1001 13:13:53.731806 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7de8ca1812be5087708ef2bafde78ee8e14f485f6331c0772023e1d7df6836c4\": container with ID starting with 7de8ca1812be5087708ef2bafde78ee8e14f485f6331c0772023e1d7df6836c4 not found: ID does not exist" containerID="7de8ca1812be5087708ef2bafde78ee8e14f485f6331c0772023e1d7df6836c4" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.731840 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7de8ca1812be5087708ef2bafde78ee8e14f485f6331c0772023e1d7df6836c4"} err="failed to get container status \"7de8ca1812be5087708ef2bafde78ee8e14f485f6331c0772023e1d7df6836c4\": rpc error: code = NotFound desc = could not find container \"7de8ca1812be5087708ef2bafde78ee8e14f485f6331c0772023e1d7df6836c4\": container with ID starting with 7de8ca1812be5087708ef2bafde78ee8e14f485f6331c0772023e1d7df6836c4 not found: ID does not exist" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.732650 4851 scope.go:117] "RemoveContainer" containerID="730a3d01e83d70037e0d1e6eec7d2255a45f99e3144d55b4d3bc2d7366216b4f" Oct 01 13:13:53 crc kubenswrapper[4851]: E1001 13:13:53.733127 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"730a3d01e83d70037e0d1e6eec7d2255a45f99e3144d55b4d3bc2d7366216b4f\": container with ID starting with 730a3d01e83d70037e0d1e6eec7d2255a45f99e3144d55b4d3bc2d7366216b4f not found: ID does not exist" containerID="730a3d01e83d70037e0d1e6eec7d2255a45f99e3144d55b4d3bc2d7366216b4f" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.733158 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"730a3d01e83d70037e0d1e6eec7d2255a45f99e3144d55b4d3bc2d7366216b4f"} err="failed to get container status \"730a3d01e83d70037e0d1e6eec7d2255a45f99e3144d55b4d3bc2d7366216b4f\": rpc error: code = NotFound desc = could not find container \"730a3d01e83d70037e0d1e6eec7d2255a45f99e3144d55b4d3bc2d7366216b4f\": container with ID starting with 730a3d01e83d70037e0d1e6eec7d2255a45f99e3144d55b4d3bc2d7366216b4f not found: ID does not exist" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.733176 4851 scope.go:117] "RemoveContainer" containerID="e24b7de9a6b59f816add29e99ed4c97b0171d8f00ee2b1591eb4620a52cff2dd" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.779919 4851 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.779949 4851 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.779982 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:53 crc kubenswrapper[4851]: 
W1001 13:13:53.990579 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda82d9b2c_d132_4496_92f6_9fea17968b18.slice/crio-0ae9514cd0da7cac0dd81906df8159dbdedefee33a0d326288625f207e3110e1 WatchSource:0}: Error finding container 0ae9514cd0da7cac0dd81906df8159dbdedefee33a0d326288625f207e3110e1: Status 404 returned error can't find the container with id 0ae9514cd0da7cac0dd81906df8159dbdedefee33a0d326288625f207e3110e1 Oct 01 13:13:53 crc kubenswrapper[4851]: I1001 13:13:53.997225 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-788dbb8495-dp5gr"] Oct 01 13:13:54 crc kubenswrapper[4851]: I1001 13:13:54.173126 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6856b75994-rphb7"] Oct 01 13:13:54 crc kubenswrapper[4851]: W1001 13:13:54.175647 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff9726c5_9ce7_444b_9dfd_9b014835c375.slice/crio-b24bb130c93518b165533525734202bb0ecabcb180a5821f5ed6d635b57c777a WatchSource:0}: Error finding container b24bb130c93518b165533525734202bb0ecabcb180a5821f5ed6d635b57c777a: Status 404 returned error can't find the container with id b24bb130c93518b165533525734202bb0ecabcb180a5821f5ed6d635b57c777a Oct 01 13:13:54 crc kubenswrapper[4851]: I1001 13:13:54.259176 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d994585bc-s9wp6"] Oct 01 13:13:54 crc kubenswrapper[4851]: I1001 13:13:54.263877 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d994585bc-s9wp6"] Oct 01 13:13:54 crc kubenswrapper[4851]: I1001 13:13:54.355935 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4" path="/var/lib/kubelet/pods/f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4/volumes" Oct 01 13:13:54 crc kubenswrapper[4851]: I1001 13:13:54.676656 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6856b75994-rphb7" event={"ID":"ff9726c5-9ce7-444b-9dfd-9b014835c375","Type":"ContainerStarted","Data":"d0c62f9449fdd155f8a797dfa16bc07b25c56ac496c075ae5870c1b0433eb21c"} Oct 01 13:13:54 crc kubenswrapper[4851]: I1001 13:13:54.676694 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6856b75994-rphb7" event={"ID":"ff9726c5-9ce7-444b-9dfd-9b014835c375","Type":"ContainerStarted","Data":"c1c97861f50ca0b69a42a4188b7b857ad0cc56dbbbaf4006b1fffd088aa3e99d"} Oct 01 13:13:54 crc kubenswrapper[4851]: I1001 13:13:54.676775 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6856b75994-rphb7" event={"ID":"ff9726c5-9ce7-444b-9dfd-9b014835c375","Type":"ContainerStarted","Data":"b24bb130c93518b165533525734202bb0ecabcb180a5821f5ed6d635b57c777a"} Oct 01 13:13:54 crc kubenswrapper[4851]: I1001 13:13:54.677274 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6856b75994-rphb7" Oct 01 13:13:54 crc kubenswrapper[4851]: I1001 13:13:54.678227 4851 generic.go:334] "Generic (PLEG): container finished" podID="a82d9b2c-d132-4496-92f6-9fea17968b18" containerID="6174c46436f76d97618e19239c65323c1c2e5203410f72bb5ba7d3d89c2d5413" exitCode=0 Oct 01 13:13:54 crc kubenswrapper[4851]: I1001 13:13:54.678526 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" 
event={"ID":"a82d9b2c-d132-4496-92f6-9fea17968b18","Type":"ContainerDied","Data":"6174c46436f76d97618e19239c65323c1c2e5203410f72bb5ba7d3d89c2d5413"} Oct 01 13:13:54 crc kubenswrapper[4851]: I1001 13:13:54.678579 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" event={"ID":"a82d9b2c-d132-4496-92f6-9fea17968b18","Type":"ContainerStarted","Data":"0ae9514cd0da7cac0dd81906df8159dbdedefee33a0d326288625f207e3110e1"} Oct 01 13:13:54 crc kubenswrapper[4851]: I1001 13:13:54.700278 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6856b75994-rphb7" podStartSLOduration=2.7002601 podStartE2EDuration="2.7002601s" podCreationTimestamp="2025-10-01 13:13:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:13:54.697969584 +0000 UTC m=+1243.043087080" watchObservedRunningTime="2025-10-01 13:13:54.7002601 +0000 UTC m=+1243.045377586" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.099565 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-57698c5d89-m6vxz"] Oct 01 13:13:55 crc kubenswrapper[4851]: E1001 13:13:55.100245 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4" containerName="dnsmasq-dns" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.100262 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4" containerName="dnsmasq-dns" Oct 01 13:13:55 crc kubenswrapper[4851]: E1001 13:13:55.100307 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4" containerName="init" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.100315 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4" containerName="init" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.100597 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1baf8f0-b24f-4dfd-a6fb-3e4e578c11f4" containerName="dnsmasq-dns" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.105589 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.108303 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.108418 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.125003 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57698c5d89-m6vxz"] Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.208472 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f07dc57-fd61-4799-8658-3ed1fcc9f01c-combined-ca-bundle\") pod \"neutron-57698c5d89-m6vxz\" (UID: \"0f07dc57-fd61-4799-8658-3ed1fcc9f01c\") " pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.208546 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f07dc57-fd61-4799-8658-3ed1fcc9f01c-ovndb-tls-certs\") pod \"neutron-57698c5d89-m6vxz\" (UID: \"0f07dc57-fd61-4799-8658-3ed1fcc9f01c\") " pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.208588 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f07dc57-fd61-4799-8658-3ed1fcc9f01c-public-tls-certs\") pod \"neutron-57698c5d89-m6vxz\" (UID: \"0f07dc57-fd61-4799-8658-3ed1fcc9f01c\") " pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.208609 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0f07dc57-fd61-4799-8658-3ed1fcc9f01c-config\") pod \"neutron-57698c5d89-m6vxz\" (UID: \"0f07dc57-fd61-4799-8658-3ed1fcc9f01c\") " pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.208846 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chm7p\" (UniqueName: \"kubernetes.io/projected/0f07dc57-fd61-4799-8658-3ed1fcc9f01c-kube-api-access-chm7p\") pod \"neutron-57698c5d89-m6vxz\" (UID: \"0f07dc57-fd61-4799-8658-3ed1fcc9f01c\") " pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.208922 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0f07dc57-fd61-4799-8658-3ed1fcc9f01c-httpd-config\") pod \"neutron-57698c5d89-m6vxz\" (UID: \"0f07dc57-fd61-4799-8658-3ed1fcc9f01c\") " pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.208959 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f07dc57-fd61-4799-8658-3ed1fcc9f01c-internal-tls-certs\") pod \"neutron-57698c5d89-m6vxz\" (UID: \"0f07dc57-fd61-4799-8658-3ed1fcc9f01c\") " pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.246940 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 01 
13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.247000 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.247017 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.247030 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.248001 4851 scope.go:117] "RemoveContainer" containerID="c20a7b78f4412c0f21c222ebc0f47ef6d52387a25f64c9b0748dd8ea538d34b0" Oct 01 13:13:55 crc kubenswrapper[4851]: E1001 13:13:55.248272 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(971ca0ac-6de7-42f1-bf29-5174fd80ced4)\"" pod="openstack/watcher-decision-engine-0" podUID="971ca0ac-6de7-42f1-bf29-5174fd80ced4" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.277640 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.311645 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chm7p\" (UniqueName: \"kubernetes.io/projected/0f07dc57-fd61-4799-8658-3ed1fcc9f01c-kube-api-access-chm7p\") pod \"neutron-57698c5d89-m6vxz\" (UID: \"0f07dc57-fd61-4799-8658-3ed1fcc9f01c\") " pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.311694 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0f07dc57-fd61-4799-8658-3ed1fcc9f01c-httpd-config\") pod \"neutron-57698c5d89-m6vxz\" (UID: \"0f07dc57-fd61-4799-8658-3ed1fcc9f01c\") " pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.311734 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f07dc57-fd61-4799-8658-3ed1fcc9f01c-internal-tls-certs\") pod \"neutron-57698c5d89-m6vxz\" (UID: \"0f07dc57-fd61-4799-8658-3ed1fcc9f01c\") " pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.311822 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f07dc57-fd61-4799-8658-3ed1fcc9f01c-combined-ca-bundle\") pod \"neutron-57698c5d89-m6vxz\" (UID: \"0f07dc57-fd61-4799-8658-3ed1fcc9f01c\") " pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.311857 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f07dc57-fd61-4799-8658-3ed1fcc9f01c-ovndb-tls-certs\") pod \"neutron-57698c5d89-m6vxz\" (UID: \"0f07dc57-fd61-4799-8658-3ed1fcc9f01c\") " pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.311904 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f07dc57-fd61-4799-8658-3ed1fcc9f01c-public-tls-certs\") pod \"neutron-57698c5d89-m6vxz\" (UID: 
\"0f07dc57-fd61-4799-8658-3ed1fcc9f01c\") " pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.311920 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0f07dc57-fd61-4799-8658-3ed1fcc9f01c-config\") pod \"neutron-57698c5d89-m6vxz\" (UID: \"0f07dc57-fd61-4799-8658-3ed1fcc9f01c\") " pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.321184 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f07dc57-fd61-4799-8658-3ed1fcc9f01c-ovndb-tls-certs\") pod \"neutron-57698c5d89-m6vxz\" (UID: \"0f07dc57-fd61-4799-8658-3ed1fcc9f01c\") " pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.322804 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0f07dc57-fd61-4799-8658-3ed1fcc9f01c-config\") pod \"neutron-57698c5d89-m6vxz\" (UID: \"0f07dc57-fd61-4799-8658-3ed1fcc9f01c\") " pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.323142 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0f07dc57-fd61-4799-8658-3ed1fcc9f01c-httpd-config\") pod \"neutron-57698c5d89-m6vxz\" (UID: \"0f07dc57-fd61-4799-8658-3ed1fcc9f01c\") " pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.319240 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f07dc57-fd61-4799-8658-3ed1fcc9f01c-public-tls-certs\") pod \"neutron-57698c5d89-m6vxz\" (UID: \"0f07dc57-fd61-4799-8658-3ed1fcc9f01c\") " pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.326067 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f07dc57-fd61-4799-8658-3ed1fcc9f01c-internal-tls-certs\") pod \"neutron-57698c5d89-m6vxz\" (UID: \"0f07dc57-fd61-4799-8658-3ed1fcc9f01c\") " pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.329458 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f07dc57-fd61-4799-8658-3ed1fcc9f01c-combined-ca-bundle\") pod \"neutron-57698c5d89-m6vxz\" (UID: \"0f07dc57-fd61-4799-8658-3ed1fcc9f01c\") " pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.352891 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chm7p\" (UniqueName: \"kubernetes.io/projected/0f07dc57-fd61-4799-8658-3ed1fcc9f01c-kube-api-access-chm7p\") pod \"neutron-57698c5d89-m6vxz\" (UID: \"0f07dc57-fd61-4799-8658-3ed1fcc9f01c\") " pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.364418 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.455973 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.697393 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1179c5d5-fcf4-4732-959f-852ba78d2829" containerName="cinder-scheduler" containerID="cri-o://8b3ed9c559569c38c9c5c3d75695afc9377fbc41354cada12bfd6710412aac8b" gracePeriod=30 Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.698404 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" event={"ID":"a82d9b2c-d132-4496-92f6-9fea17968b18","Type":"ContainerStarted","Data":"4fa2acb1d0369e23bb7b9f896175d9a71261580a50338f3228baddf6bc79fee1"} Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.698431 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.699073 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1179c5d5-fcf4-4732-959f-852ba78d2829" containerName="probe" containerID="cri-o://0e6c499764f6c34718bc6b514bd73d540b331e4dcd4d95de4bcd842bbee792d2" gracePeriod=30 Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.765726 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.771167 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7c6fb58db4-9tzr2" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.788840 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" podStartSLOduration=3.788825193 podStartE2EDuration="3.788825193s" podCreationTimestamp="2025-10-01 13:13:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:13:55.727563755 +0000 UTC m=+1244.072681251" watchObservedRunningTime="2025-10-01 13:13:55.788825193 +0000 UTC m=+1244.133942679" Oct 01 13:13:55 crc kubenswrapper[4851]: I1001 13:13:55.865986 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67b666754b-b52ns" podUID="1db63449-71cb-4fa2-86db-43a83a914643" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.160:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.160:8443: connect: connection refused" Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.063527 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57698c5d89-m6vxz"] Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.356180 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.357552 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.357636 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.372983 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-q2gv9" Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.373135 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.373357 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.445747 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f\") " pod="openstack/openstackclient" Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.445803 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f-openstack-config\") pod \"openstackclient\" (UID: \"3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f\") " pod="openstack/openstackclient" Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.445896 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw8md\" (UniqueName: \"kubernetes.io/projected/3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f-kube-api-access-fw8md\") pod \"openstackclient\" (UID: \"3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f\") " pod="openstack/openstackclient" Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.446039 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f\") " pod="openstack/openstackclient" Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.548344 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f\") " pod="openstack/openstackclient" Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.548426 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f-openstack-config\") pod \"openstackclient\" (UID: \"3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f\") " pod="openstack/openstackclient" Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.548489 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw8md\" (UniqueName: \"kubernetes.io/projected/3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f-kube-api-access-fw8md\") pod \"openstackclient\" (UID: \"3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f\") " pod="openstack/openstackclient" Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.548540 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f-combined-ca-bundle\") pod \"openstackclient\" 
(UID: \"3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f\") " pod="openstack/openstackclient" Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.550078 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f-openstack-config\") pod \"openstackclient\" (UID: \"3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f\") " pod="openstack/openstackclient" Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.553115 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f\") " pod="openstack/openstackclient" Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.553146 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f\") " pod="openstack/openstackclient" Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.570250 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw8md\" (UniqueName: \"kubernetes.io/projected/3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f-kube-api-access-fw8md\") pod \"openstackclient\" (UID: \"3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f\") " pod="openstack/openstackclient" Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.713977 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57698c5d89-m6vxz" event={"ID":"0f07dc57-fd61-4799-8658-3ed1fcc9f01c","Type":"ContainerStarted","Data":"928de7839b00fc30a6689e682ba0f8a700d5c4ee56cf6ff151341a828f55c36f"} Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.714031 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57698c5d89-m6vxz" event={"ID":"0f07dc57-fd61-4799-8658-3ed1fcc9f01c","Type":"ContainerStarted","Data":"094f05f4afd231fd1971eef4ac6d7d18b4130fb55771da2b4aed18b2f777ffe7"} Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.714045 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57698c5d89-m6vxz" event={"ID":"0f07dc57-fd61-4799-8658-3ed1fcc9f01c","Type":"ContainerStarted","Data":"4e94a17e08f60918233772fa107656aca06b4421e20166ac598920fc33a63a56"} Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.715394 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.718679 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1179c5d5-fcf4-4732-959f-852ba78d2829","Type":"ContainerDied","Data":"0e6c499764f6c34718bc6b514bd73d540b331e4dcd4d95de4bcd842bbee792d2"} Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.718481 4851 generic.go:334] "Generic (PLEG): container finished" podID="1179c5d5-fcf4-4732-959f-852ba78d2829" containerID="0e6c499764f6c34718bc6b514bd73d540b331e4dcd4d95de4bcd842bbee792d2" exitCode=0 Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.748149 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-57698c5d89-m6vxz" podStartSLOduration=1.748130556 podStartE2EDuration="1.748130556s" podCreationTimestamp="2025-10-01 13:13:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:13:56.737961276 +0000 UTC m=+1245.083078762" watchObservedRunningTime="2025-10-01 13:13:56.748130556 +0000 UTC m=+1245.093248032" Oct 01 13:13:56 crc kubenswrapper[4851]: I1001 13:13:56.782025 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 01 13:13:57 crc kubenswrapper[4851]: I1001 13:13:57.246707 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 01 13:13:57 crc kubenswrapper[4851]: I1001 13:13:57.733669 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f","Type":"ContainerStarted","Data":"23a68050217fcb8affde23a2f7edde38e89de6d82f86eac7ae2453fa368fe63f"} Oct 01 13:13:58 crc kubenswrapper[4851]: I1001 13:13:58.750224 4851 generic.go:334] "Generic (PLEG): container finished" podID="1179c5d5-fcf4-4732-959f-852ba78d2829" containerID="8b3ed9c559569c38c9c5c3d75695afc9377fbc41354cada12bfd6710412aac8b" exitCode=0 Oct 01 13:13:58 crc kubenswrapper[4851]: I1001 13:13:58.750292 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1179c5d5-fcf4-4732-959f-852ba78d2829","Type":"ContainerDied","Data":"8b3ed9c559569c38c9c5c3d75695afc9377fbc41354cada12bfd6710412aac8b"} Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.265440 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.343260 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-config-data\") pod \"1179c5d5-fcf4-4732-959f-852ba78d2829\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.343332 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1179c5d5-fcf4-4732-959f-852ba78d2829-etc-machine-id\") pod \"1179c5d5-fcf4-4732-959f-852ba78d2829\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.343430 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-config-data-custom\") pod \"1179c5d5-fcf4-4732-959f-852ba78d2829\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.343457 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpwt2\" (UniqueName: \"kubernetes.io/projected/1179c5d5-fcf4-4732-959f-852ba78d2829-kube-api-access-kpwt2\") pod \"1179c5d5-fcf4-4732-959f-852ba78d2829\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.343468 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1179c5d5-fcf4-4732-959f-852ba78d2829-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1179c5d5-fcf4-4732-959f-852ba78d2829" (UID: "1179c5d5-fcf4-4732-959f-852ba78d2829"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.343583 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-combined-ca-bundle\") pod \"1179c5d5-fcf4-4732-959f-852ba78d2829\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.343673 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-scripts\") pod \"1179c5d5-fcf4-4732-959f-852ba78d2829\" (UID: \"1179c5d5-fcf4-4732-959f-852ba78d2829\") " Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.344048 4851 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1179c5d5-fcf4-4732-959f-852ba78d2829-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.349624 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1179c5d5-fcf4-4732-959f-852ba78d2829" (UID: "1179c5d5-fcf4-4732-959f-852ba78d2829"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.351118 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-scripts" (OuterVolumeSpecName: "scripts") pod "1179c5d5-fcf4-4732-959f-852ba78d2829" (UID: "1179c5d5-fcf4-4732-959f-852ba78d2829"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.351653 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1179c5d5-fcf4-4732-959f-852ba78d2829-kube-api-access-kpwt2" (OuterVolumeSpecName: "kube-api-access-kpwt2") pod "1179c5d5-fcf4-4732-959f-852ba78d2829" (UID: "1179c5d5-fcf4-4732-959f-852ba78d2829"). InnerVolumeSpecName "kube-api-access-kpwt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.423288 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1179c5d5-fcf4-4732-959f-852ba78d2829" (UID: "1179c5d5-fcf4-4732-959f-852ba78d2829"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.447078 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.447120 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.447138 4851 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.447151 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpwt2\" (UniqueName: \"kubernetes.io/projected/1179c5d5-fcf4-4732-959f-852ba78d2829-kube-api-access-kpwt2\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.496162 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-config-data" (OuterVolumeSpecName: "config-data") pod "1179c5d5-fcf4-4732-959f-852ba78d2829" (UID: "1179c5d5-fcf4-4732-959f-852ba78d2829"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.550301 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1179c5d5-fcf4-4732-959f-852ba78d2829-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.767479 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1179c5d5-fcf4-4732-959f-852ba78d2829","Type":"ContainerDied","Data":"6575908806aff04738d95c27ec0663f60f81f0342cabc66d6f8b88d3aa205151"} Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.767559 4851 scope.go:117] "RemoveContainer" containerID="0e6c499764f6c34718bc6b514bd73d540b331e4dcd4d95de4bcd842bbee792d2" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.767699 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.798329 4851 scope.go:117] "RemoveContainer" containerID="8b3ed9c559569c38c9c5c3d75695afc9377fbc41354cada12bfd6710412aac8b" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.808143 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.816552 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.828658 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 13:13:59 crc kubenswrapper[4851]: E1001 13:13:59.829033 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1179c5d5-fcf4-4732-959f-852ba78d2829" containerName="cinder-scheduler" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.829050 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="1179c5d5-fcf4-4732-959f-852ba78d2829" containerName="cinder-scheduler" Oct 01 13:13:59 crc kubenswrapper[4851]: E1001 13:13:59.829074 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1179c5d5-fcf4-4732-959f-852ba78d2829" containerName="probe" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.829080 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="1179c5d5-fcf4-4732-959f-852ba78d2829" containerName="probe" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.829278 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="1179c5d5-fcf4-4732-959f-852ba78d2829" containerName="cinder-scheduler" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.829301 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="1179c5d5-fcf4-4732-959f-852ba78d2829" containerName="probe" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.830238 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.832119 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.845923 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.956859 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjll8\" (UniqueName: \"kubernetes.io/projected/dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9-kube-api-access-wjll8\") pod \"cinder-scheduler-0\" (UID: \"dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.956927 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9-scripts\") pod \"cinder-scheduler-0\" (UID: \"dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.956952 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.956977 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.957135 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9-config-data\") pod \"cinder-scheduler-0\" (UID: \"dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9\") " pod="openstack/cinder-scheduler-0" Oct 01 13:13:59 crc kubenswrapper[4851]: I1001 13:13:59.957212 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9\") " pod="openstack/cinder-scheduler-0" Oct 01 13:14:00 crc kubenswrapper[4851]: I1001 13:14:00.059275 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9-config-data\") pod \"cinder-scheduler-0\" (UID: \"dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9\") " pod="openstack/cinder-scheduler-0" Oct 01 13:14:00 crc kubenswrapper[4851]: I1001 13:14:00.059328 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9\") " pod="openstack/cinder-scheduler-0" Oct 01 13:14:00 crc kubenswrapper[4851]: I1001 13:14:00.059424 4851 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wjll8\" (UniqueName: \"kubernetes.io/projected/dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9-kube-api-access-wjll8\") pod \"cinder-scheduler-0\" (UID: \"dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9\") " pod="openstack/cinder-scheduler-0" Oct 01 13:14:00 crc kubenswrapper[4851]: I1001 13:14:00.059464 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9-scripts\") pod \"cinder-scheduler-0\" (UID: \"dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9\") " pod="openstack/cinder-scheduler-0" Oct 01 13:14:00 crc kubenswrapper[4851]: I1001 13:14:00.059486 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9\") " pod="openstack/cinder-scheduler-0" Oct 01 13:14:00 crc kubenswrapper[4851]: I1001 13:14:00.059521 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9\") " pod="openstack/cinder-scheduler-0" Oct 01 13:14:00 crc kubenswrapper[4851]: I1001 13:14:00.059826 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9\") " pod="openstack/cinder-scheduler-0" Oct 01 13:14:00 crc kubenswrapper[4851]: I1001 13:14:00.069921 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9\") " pod="openstack/cinder-scheduler-0" Oct 01 13:14:00 crc kubenswrapper[4851]: I1001 13:14:00.069962 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9\") " pod="openstack/cinder-scheduler-0" Oct 01 13:14:00 crc kubenswrapper[4851]: I1001 13:14:00.080588 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9-config-data\") pod \"cinder-scheduler-0\" (UID: \"dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9\") " pod="openstack/cinder-scheduler-0" Oct 01 13:14:00 crc kubenswrapper[4851]: I1001 13:14:00.091429 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9-scripts\") pod \"cinder-scheduler-0\" (UID: \"dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9\") " pod="openstack/cinder-scheduler-0" Oct 01 13:14:00 crc kubenswrapper[4851]: I1001 13:14:00.096037 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjll8\" (UniqueName: \"kubernetes.io/projected/dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9-kube-api-access-wjll8\") pod \"cinder-scheduler-0\" (UID: \"dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9\") " pod="openstack/cinder-scheduler-0" Oct 01 
13:14:00 crc kubenswrapper[4851]: I1001 13:14:00.156090 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 13:14:00 crc kubenswrapper[4851]: I1001 13:14:00.338860 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1179c5d5-fcf4-4732-959f-852ba78d2829" path="/var/lib/kubelet/pods/1179c5d5-fcf4-4732-959f-852ba78d2829/volumes" Oct 01 13:14:00 crc kubenswrapper[4851]: I1001 13:14:00.733281 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 13:14:00 crc kubenswrapper[4851]: W1001 13:14:00.744820 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcbf27cf_3d1e_48c8_89f2_ac85568c6ae9.slice/crio-459a156e0e9ccc0da2d5a1fe9fecef1eab53a98c9de9d568935ffddd6f91c094 WatchSource:0}: Error finding container 459a156e0e9ccc0da2d5a1fe9fecef1eab53a98c9de9d568935ffddd6f91c094: Status 404 returned error can't find the container with id 459a156e0e9ccc0da2d5a1fe9fecef1eab53a98c9de9d568935ffddd6f91c094 Oct 01 13:14:00 crc kubenswrapper[4851]: I1001 13:14:00.781251 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9","Type":"ContainerStarted","Data":"459a156e0e9ccc0da2d5a1fe9fecef1eab53a98c9de9d568935ffddd6f91c094"} Oct 01 13:14:01 crc kubenswrapper[4851]: I1001 13:14:01.799185 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9","Type":"ContainerStarted","Data":"181e56c9876a0e0a330a602ff3770f1d138aac1d19873c5e765764ce9ae812eb"} Oct 01 13:14:02 crc kubenswrapper[4851]: I1001 13:14:02.395600 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 01 13:14:02 crc kubenswrapper[4851]: I1001 13:14:02.801011 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-b55c56cb7-m2hqv"] Oct 01 13:14:02 crc kubenswrapper[4851]: I1001 13:14:02.805241 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:02 crc kubenswrapper[4851]: I1001 13:14:02.808847 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 01 13:14:02 crc kubenswrapper[4851]: I1001 13:14:02.809041 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 01 13:14:02 crc kubenswrapper[4851]: I1001 13:14:02.809200 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 01 13:14:02 crc kubenswrapper[4851]: I1001 13:14:02.819512 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9","Type":"ContainerStarted","Data":"f65cdb579d92665dc706c1dfc01fe3f509390169f4d361fdd778770517297528"} Oct 01 13:14:02 crc kubenswrapper[4851]: I1001 13:14:02.844277 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-b55c56cb7-m2hqv"] Oct 01 13:14:02 crc kubenswrapper[4851]: I1001 13:14:02.865246 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.865230481 podStartE2EDuration="3.865230481s" podCreationTimestamp="2025-10-01 13:13:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:14:02.859154547 +0000 UTC m=+1251.204272043" watchObservedRunningTime="2025-10-01 13:14:02.865230481 +0000 UTC m=+1251.210347967" Oct 01 13:14:02 crc kubenswrapper[4851]: I1001 13:14:02.926381 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39d289ff-07b0-479b-8f7a-bb1e5c108be2-log-httpd\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: \"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:02 crc kubenswrapper[4851]: I1001 13:14:02.926481 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39d289ff-07b0-479b-8f7a-bb1e5c108be2-etc-swift\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: \"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:02 crc kubenswrapper[4851]: I1001 13:14:02.926648 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39d289ff-07b0-479b-8f7a-bb1e5c108be2-internal-tls-certs\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: \"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:02 crc kubenswrapper[4851]: I1001 13:14:02.926691 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39d289ff-07b0-479b-8f7a-bb1e5c108be2-run-httpd\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: \"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:02 crc kubenswrapper[4851]: I1001 13:14:02.926720 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6ccm\" (UniqueName: \"kubernetes.io/projected/39d289ff-07b0-479b-8f7a-bb1e5c108be2-kube-api-access-p6ccm\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: 
\"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:02 crc kubenswrapper[4851]: I1001 13:14:02.926821 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39d289ff-07b0-479b-8f7a-bb1e5c108be2-config-data\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: \"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:02 crc kubenswrapper[4851]: I1001 13:14:02.926984 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39d289ff-07b0-479b-8f7a-bb1e5c108be2-public-tls-certs\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: \"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:02 crc kubenswrapper[4851]: I1001 13:14:02.927088 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39d289ff-07b0-479b-8f7a-bb1e5c108be2-combined-ca-bundle\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: \"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.028260 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39d289ff-07b0-479b-8f7a-bb1e5c108be2-etc-swift\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: \"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.028314 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39d289ff-07b0-479b-8f7a-bb1e5c108be2-internal-tls-certs\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: \"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.028335 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39d289ff-07b0-479b-8f7a-bb1e5c108be2-run-httpd\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: \"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.028354 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6ccm\" (UniqueName: \"kubernetes.io/projected/39d289ff-07b0-479b-8f7a-bb1e5c108be2-kube-api-access-p6ccm\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: \"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.028384 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39d289ff-07b0-479b-8f7a-bb1e5c108be2-config-data\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: \"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.028427 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39d289ff-07b0-479b-8f7a-bb1e5c108be2-public-tls-certs\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: 
\"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.028814 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39d289ff-07b0-479b-8f7a-bb1e5c108be2-run-httpd\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: \"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.029409 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39d289ff-07b0-479b-8f7a-bb1e5c108be2-combined-ca-bundle\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: \"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.029894 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39d289ff-07b0-479b-8f7a-bb1e5c108be2-log-httpd\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: \"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.030200 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39d289ff-07b0-479b-8f7a-bb1e5c108be2-log-httpd\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: \"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.036122 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39d289ff-07b0-479b-8f7a-bb1e5c108be2-combined-ca-bundle\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: \"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.038277 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39d289ff-07b0-479b-8f7a-bb1e5c108be2-public-tls-certs\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: \"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.043609 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39d289ff-07b0-479b-8f7a-bb1e5c108be2-config-data\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: \"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.044879 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39d289ff-07b0-479b-8f7a-bb1e5c108be2-internal-tls-certs\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: \"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.046583 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6ccm\" (UniqueName: \"kubernetes.io/projected/39d289ff-07b0-479b-8f7a-bb1e5c108be2-kube-api-access-p6ccm\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: \"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 
13:14:03.046955 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39d289ff-07b0-479b-8f7a-bb1e5c108be2-etc-swift\") pod \"swift-proxy-b55c56cb7-m2hqv\" (UID: \"39d289ff-07b0-479b-8f7a-bb1e5c108be2\") " pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.122088 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.412424 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.479612 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-956c876c5-cvlqz"] Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.479849 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-956c876c5-cvlqz" podUID="97ea855a-7426-4e7d-af3d-c7d498622629" containerName="dnsmasq-dns" containerID="cri-o://6597334ee09778a6d3effb76f5dbb9fec5d03ff96d7aa500cb7462e5a64f260a" gracePeriod=10 Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.706228 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.706731 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="973a48f4-ab11-4c1d-b358-36c986b44f8c" containerName="ceilometer-central-agent" containerID="cri-o://9c11cab70fb348044092c4a0b61186fefb99429ff5a33d4a419303a0d7ad7432" gracePeriod=30 Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.706846 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="973a48f4-ab11-4c1d-b358-36c986b44f8c" containerName="proxy-httpd" containerID="cri-o://1ed97d7f4b48d3b9b7a83d8c46053ac262a4889712ef898ef29dd2503552c0bb" gracePeriod=30 Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.706882 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="973a48f4-ab11-4c1d-b358-36c986b44f8c" containerName="sg-core" containerID="cri-o://454cdbc95cfa935e30954f494ccea16c320b736804108fbf89fe7f53ac0430ad" gracePeriod=30 Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.706920 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="973a48f4-ab11-4c1d-b358-36c986b44f8c" containerName="ceilometer-notification-agent" containerID="cri-o://31cb5f3156b36394319f6015cd6472af26f2f82f94a3c2720b9a8746f95387dc" gracePeriod=30 Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.738302 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="973a48f4-ab11-4c1d-b358-36c986b44f8c" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.176:3000/\": EOF" Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.837349 4851 generic.go:334] "Generic (PLEG): container finished" podID="97ea855a-7426-4e7d-af3d-c7d498622629" containerID="6597334ee09778a6d3effb76f5dbb9fec5d03ff96d7aa500cb7462e5a64f260a" exitCode=0 Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.837413 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-956c876c5-cvlqz" 
event={"ID":"97ea855a-7426-4e7d-af3d-c7d498622629","Type":"ContainerDied","Data":"6597334ee09778a6d3effb76f5dbb9fec5d03ff96d7aa500cb7462e5a64f260a"} Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.850679 4851 generic.go:334] "Generic (PLEG): container finished" podID="973a48f4-ab11-4c1d-b358-36c986b44f8c" containerID="454cdbc95cfa935e30954f494ccea16c320b736804108fbf89fe7f53ac0430ad" exitCode=2 Oct 01 13:14:03 crc kubenswrapper[4851]: I1001 13:14:03.850717 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"973a48f4-ab11-4c1d-b358-36c986b44f8c","Type":"ContainerDied","Data":"454cdbc95cfa935e30954f494ccea16c320b736804108fbf89fe7f53ac0430ad"} Oct 01 13:14:04 crc kubenswrapper[4851]: I1001 13:14:04.866196 4851 generic.go:334] "Generic (PLEG): container finished" podID="973a48f4-ab11-4c1d-b358-36c986b44f8c" containerID="1ed97d7f4b48d3b9b7a83d8c46053ac262a4889712ef898ef29dd2503552c0bb" exitCode=0 Oct 01 13:14:04 crc kubenswrapper[4851]: I1001 13:14:04.866224 4851 generic.go:334] "Generic (PLEG): container finished" podID="973a48f4-ab11-4c1d-b358-36c986b44f8c" containerID="9c11cab70fb348044092c4a0b61186fefb99429ff5a33d4a419303a0d7ad7432" exitCode=0 Oct 01 13:14:04 crc kubenswrapper[4851]: I1001 13:14:04.866242 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"973a48f4-ab11-4c1d-b358-36c986b44f8c","Type":"ContainerDied","Data":"1ed97d7f4b48d3b9b7a83d8c46053ac262a4889712ef898ef29dd2503552c0bb"} Oct 01 13:14:04 crc kubenswrapper[4851]: I1001 13:14:04.866267 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"973a48f4-ab11-4c1d-b358-36c986b44f8c","Type":"ContainerDied","Data":"9c11cab70fb348044092c4a0b61186fefb99429ff5a33d4a419303a0d7ad7432"} Oct 01 13:14:05 crc kubenswrapper[4851]: I1001 13:14:05.157009 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 01 13:14:05 crc kubenswrapper[4851]: I1001 13:14:05.328467 4851 scope.go:117] "RemoveContainer" containerID="c20a7b78f4412c0f21c222ebc0f47ef6d52387a25f64c9b0748dd8ea538d34b0" Oct 01 13:14:05 crc kubenswrapper[4851]: E1001 13:14:05.328713 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(971ca0ac-6de7-42f1-bf29-5174fd80ced4)\"" pod="openstack/watcher-decision-engine-0" podUID="971ca0ac-6de7-42f1-bf29-5174fd80ced4" Oct 01 13:14:05 crc kubenswrapper[4851]: I1001 13:14:05.866917 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67b666754b-b52ns" podUID="1db63449-71cb-4fa2-86db-43a83a914643" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.160:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.160:8443: connect: connection refused" Oct 01 13:14:06 crc kubenswrapper[4851]: I1001 13:14:06.519025 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-956c876c5-cvlqz" podUID="97ea855a-7426-4e7d-af3d-c7d498622629" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.172:5353: connect: connection refused" Oct 01 13:14:06 crc kubenswrapper[4851]: I1001 13:14:06.886574 4851 generic.go:334] "Generic (PLEG): container finished" podID="973a48f4-ab11-4c1d-b358-36c986b44f8c" 
containerID="31cb5f3156b36394319f6015cd6472af26f2f82f94a3c2720b9a8746f95387dc" exitCode=0 Oct 01 13:14:06 crc kubenswrapper[4851]: I1001 13:14:06.886625 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"973a48f4-ab11-4c1d-b358-36c986b44f8c","Type":"ContainerDied","Data":"31cb5f3156b36394319f6015cd6472af26f2f82f94a3c2720b9a8746f95387dc"} Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.695682 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.797576 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.801952 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="556db500-a192-4dfa-8a7f-4ab1d1e7e15b" containerName="glance-httpd" containerID="cri-o://3c62828291e0d54dc121908d792d7badaec5beb089700b9e32d232ebbc2d7ccd" gracePeriod=30 Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.801641 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="556db500-a192-4dfa-8a7f-4ab1d1e7e15b" containerName="glance-log" containerID="cri-o://e64190c7410f428cda3cb53f01f5f24b91b5ecd57305ce3aaaffc33f3c6acb7e" gracePeriod=30 Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.813266 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.876574 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-ovsdbserver-nb\") pod \"97ea855a-7426-4e7d-af3d-c7d498622629\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.876880 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-dns-svc\") pod \"97ea855a-7426-4e7d-af3d-c7d498622629\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.876964 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-dns-swift-storage-0\") pod \"97ea855a-7426-4e7d-af3d-c7d498622629\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.877084 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf6pf\" (UniqueName: \"kubernetes.io/projected/97ea855a-7426-4e7d-af3d-c7d498622629-kube-api-access-qf6pf\") pod \"97ea855a-7426-4e7d-af3d-c7d498622629\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.877184 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-config\") pod \"97ea855a-7426-4e7d-af3d-c7d498622629\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.877318 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-ovsdbserver-sb\") pod \"97ea855a-7426-4e7d-af3d-c7d498622629\" (UID: \"97ea855a-7426-4e7d-af3d-c7d498622629\") " Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.884429 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97ea855a-7426-4e7d-af3d-c7d498622629-kube-api-access-qf6pf" (OuterVolumeSpecName: "kube-api-access-qf6pf") pod "97ea855a-7426-4e7d-af3d-c7d498622629" (UID: "97ea855a-7426-4e7d-af3d-c7d498622629"). InnerVolumeSpecName "kube-api-access-qf6pf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.938986 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-956c876c5-cvlqz" event={"ID":"97ea855a-7426-4e7d-af3d-c7d498622629","Type":"ContainerDied","Data":"a003ca1070f8f6c76a127673f3fc7252102d62cd280ad08014eea7f85526f4d3"} Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.939042 4851 scope.go:117] "RemoveContainer" containerID="6597334ee09778a6d3effb76f5dbb9fec5d03ff96d7aa500cb7462e5a64f260a" Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.939187 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-956c876c5-cvlqz" Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.972177 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"973a48f4-ab11-4c1d-b358-36c986b44f8c","Type":"ContainerDied","Data":"7547d91f8a340e368c10f0d6466101cc3c9b26345a464fe5c003cd886e0b60cf"} Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.972285 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.972553 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "97ea855a-7426-4e7d-af3d-c7d498622629" (UID: "97ea855a-7426-4e7d-af3d-c7d498622629"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.974426 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f","Type":"ContainerStarted","Data":"d549d8ccf6a5943b7826c967e43e5a92a0fe5ecab4a42d8e1845e7b7878fd984"} Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.979649 4851 scope.go:117] "RemoveContainer" containerID="eff1eea81bdcd588724281cd7d2b7364d414e44be86ee8e76d18498267e2e2e8" Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.979991 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/973a48f4-ab11-4c1d-b358-36c986b44f8c-run-httpd\") pod \"973a48f4-ab11-4c1d-b358-36c986b44f8c\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.980110 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-scripts\") pod \"973a48f4-ab11-4c1d-b358-36c986b44f8c\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.980160 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5bj8\" (UniqueName: \"kubernetes.io/projected/973a48f4-ab11-4c1d-b358-36c986b44f8c-kube-api-access-p5bj8\") pod \"973a48f4-ab11-4c1d-b358-36c986b44f8c\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.980211 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/973a48f4-ab11-4c1d-b358-36c986b44f8c-log-httpd\") pod \"973a48f4-ab11-4c1d-b358-36c986b44f8c\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.980230 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-combined-ca-bundle\") pod \"973a48f4-ab11-4c1d-b358-36c986b44f8c\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.980347 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-sg-core-conf-yaml\") pod \"973a48f4-ab11-4c1d-b358-36c986b44f8c\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.980374 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-config-data\") pod \"973a48f4-ab11-4c1d-b358-36c986b44f8c\" (UID: \"973a48f4-ab11-4c1d-b358-36c986b44f8c\") " Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.981442 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/973a48f4-ab11-4c1d-b358-36c986b44f8c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "973a48f4-ab11-4c1d-b358-36c986b44f8c" (UID: "973a48f4-ab11-4c1d-b358-36c986b44f8c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.984748 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/973a48f4-ab11-4c1d-b358-36c986b44f8c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "973a48f4-ab11-4c1d-b358-36c986b44f8c" (UID: "973a48f4-ab11-4c1d-b358-36c986b44f8c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.986052 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-scripts" (OuterVolumeSpecName: "scripts") pod "973a48f4-ab11-4c1d-b358-36c986b44f8c" (UID: "973a48f4-ab11-4c1d-b358-36c986b44f8c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.994320 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.994356 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.994367 4851 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/973a48f4-ab11-4c1d-b358-36c986b44f8c-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.994377 4851 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/973a48f4-ab11-4c1d-b358-36c986b44f8c-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:08 crc kubenswrapper[4851]: I1001 13:14:08.994386 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qf6pf\" (UniqueName: \"kubernetes.io/projected/97ea855a-7426-4e7d-af3d-c7d498622629-kube-api-access-qf6pf\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.010053 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/973a48f4-ab11-4c1d-b358-36c986b44f8c-kube-api-access-p5bj8" (OuterVolumeSpecName: "kube-api-access-p5bj8") pod "973a48f4-ab11-4c1d-b358-36c986b44f8c" (UID: "973a48f4-ab11-4c1d-b358-36c986b44f8c"). InnerVolumeSpecName "kube-api-access-p5bj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.017042 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.056719111 podStartE2EDuration="13.017022575s" podCreationTimestamp="2025-10-01 13:13:56 +0000 UTC" firstStartedPulling="2025-10-01 13:13:57.269075167 +0000 UTC m=+1245.614192643" lastFinishedPulling="2025-10-01 13:14:08.229378621 +0000 UTC m=+1256.574496107" observedRunningTime="2025-10-01 13:14:09.010260552 +0000 UTC m=+1257.355378038" watchObservedRunningTime="2025-10-01 13:14:09.017022575 +0000 UTC m=+1257.362140061" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.043736 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "97ea855a-7426-4e7d-af3d-c7d498622629" (UID: "97ea855a-7426-4e7d-af3d-c7d498622629"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.043872 4851 scope.go:117] "RemoveContainer" containerID="1ed97d7f4b48d3b9b7a83d8c46053ac262a4889712ef898ef29dd2503552c0bb" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.046118 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "97ea855a-7426-4e7d-af3d-c7d498622629" (UID: "97ea855a-7426-4e7d-af3d-c7d498622629"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.051841 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "973a48f4-ab11-4c1d-b358-36c986b44f8c" (UID: "973a48f4-ab11-4c1d-b358-36c986b44f8c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.059252 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-config" (OuterVolumeSpecName: "config") pod "97ea855a-7426-4e7d-af3d-c7d498622629" (UID: "97ea855a-7426-4e7d-af3d-c7d498622629"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.065928 4851 scope.go:117] "RemoveContainer" containerID="454cdbc95cfa935e30954f494ccea16c320b736804108fbf89fe7f53ac0430ad" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.076830 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "97ea855a-7426-4e7d-af3d-c7d498622629" (UID: "97ea855a-7426-4e7d-af3d-c7d498622629"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.085348 4851 scope.go:117] "RemoveContainer" containerID="31cb5f3156b36394319f6015cd6472af26f2f82f94a3c2720b9a8746f95387dc" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.093520 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-b55c56cb7-m2hqv"] Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.096273 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5bj8\" (UniqueName: \"kubernetes.io/projected/973a48f4-ab11-4c1d-b358-36c986b44f8c-kube-api-access-p5bj8\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.096518 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.096952 4851 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.096987 4851 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.097000 4851 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.097011 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ea855a-7426-4e7d-af3d-c7d498622629-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.105475 4851 scope.go:117] "RemoveContainer" containerID="9c11cab70fb348044092c4a0b61186fefb99429ff5a33d4a419303a0d7ad7432" Oct 01 13:14:09 crc kubenswrapper[4851]: W1001 13:14:09.111550 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39d289ff_07b0_479b_8f7a_bb1e5c108be2.slice/crio-7bee0d1037c4655f1b8c3360fd03426dadec7c0ad3c6b189830440e45733a789 WatchSource:0}: Error finding container 7bee0d1037c4655f1b8c3360fd03426dadec7c0ad3c6b189830440e45733a789: Status 404 returned error can't find the container with id 7bee0d1037c4655f1b8c3360fd03426dadec7c0ad3c6b189830440e45733a789 Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.137037 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "973a48f4-ab11-4c1d-b358-36c986b44f8c" (UID: "973a48f4-ab11-4c1d-b358-36c986b44f8c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.193554 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-config-data" (OuterVolumeSpecName: "config-data") pod "973a48f4-ab11-4c1d-b358-36c986b44f8c" (UID: "973a48f4-ab11-4c1d-b358-36c986b44f8c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.200571 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.200808 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973a48f4-ab11-4c1d-b358-36c986b44f8c-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.331934 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-956c876c5-cvlqz"] Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.346351 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-956c876c5-cvlqz"] Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.362512 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.374227 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.383933 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:14:09 crc kubenswrapper[4851]: E1001 13:14:09.384362 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="973a48f4-ab11-4c1d-b358-36c986b44f8c" containerName="proxy-httpd" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.384380 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="973a48f4-ab11-4c1d-b358-36c986b44f8c" containerName="proxy-httpd" Oct 01 13:14:09 crc kubenswrapper[4851]: E1001 13:14:09.384412 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ea855a-7426-4e7d-af3d-c7d498622629" containerName="init" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.384419 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ea855a-7426-4e7d-af3d-c7d498622629" containerName="init" Oct 01 13:14:09 crc kubenswrapper[4851]: E1001 13:14:09.384428 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="973a48f4-ab11-4c1d-b358-36c986b44f8c" containerName="sg-core" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.384433 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="973a48f4-ab11-4c1d-b358-36c986b44f8c" containerName="sg-core" Oct 01 13:14:09 crc kubenswrapper[4851]: E1001 13:14:09.384442 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="973a48f4-ab11-4c1d-b358-36c986b44f8c" containerName="ceilometer-central-agent" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.384447 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="973a48f4-ab11-4c1d-b358-36c986b44f8c" containerName="ceilometer-central-agent" Oct 01 13:14:09 crc kubenswrapper[4851]: E1001 13:14:09.384459 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="973a48f4-ab11-4c1d-b358-36c986b44f8c" containerName="ceilometer-notification-agent" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.384466 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="973a48f4-ab11-4c1d-b358-36c986b44f8c" containerName="ceilometer-notification-agent" Oct 01 13:14:09 crc kubenswrapper[4851]: E1001 13:14:09.384478 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ea855a-7426-4e7d-af3d-c7d498622629" containerName="dnsmasq-dns" Oct 01 13:14:09 
crc kubenswrapper[4851]: I1001 13:14:09.384485 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ea855a-7426-4e7d-af3d-c7d498622629" containerName="dnsmasq-dns" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.384660 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="973a48f4-ab11-4c1d-b358-36c986b44f8c" containerName="sg-core" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.384678 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="973a48f4-ab11-4c1d-b358-36c986b44f8c" containerName="ceilometer-central-agent" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.384689 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="973a48f4-ab11-4c1d-b358-36c986b44f8c" containerName="ceilometer-notification-agent" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.384702 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="973a48f4-ab11-4c1d-b358-36c986b44f8c" containerName="proxy-httpd" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.384711 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="97ea855a-7426-4e7d-af3d-c7d498622629" containerName="dnsmasq-dns" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.386415 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.389016 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.389142 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.424454 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.512278 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-config-data\") pod \"ceilometer-0\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.512321 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.512342 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.512363 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-scripts\") pod \"ceilometer-0\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.512401 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-run-httpd\") pod \"ceilometer-0\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.512423 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8m2h\" (UniqueName: \"kubernetes.io/projected/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-kube-api-access-l8m2h\") pod \"ceilometer-0\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.512443 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-log-httpd\") pod \"ceilometer-0\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.614470 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-scripts\") pod \"ceilometer-0\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.614833 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-run-httpd\") pod \"ceilometer-0\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.614856 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8m2h\" (UniqueName: \"kubernetes.io/projected/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-kube-api-access-l8m2h\") pod \"ceilometer-0\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.614882 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-log-httpd\") pod \"ceilometer-0\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.615004 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-config-data\") pod \"ceilometer-0\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.615027 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.615059 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.620738 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-log-httpd\") pod \"ceilometer-0\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.628328 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.631056 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-scripts\") pod \"ceilometer-0\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.632382 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-config-data\") pod \"ceilometer-0\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.635797 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-run-httpd\") pod \"ceilometer-0\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.636096 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.639091 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8m2h\" (UniqueName: \"kubernetes.io/projected/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-kube-api-access-l8m2h\") pod \"ceilometer-0\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.712551 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.777482 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-sqnxr"] Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.786900 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-sqnxr" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.787924 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-sqnxr"] Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.882304 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-cchtl"] Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.898778 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-cchtl" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.900051 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-cchtl"] Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.921333 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rftb\" (UniqueName: \"kubernetes.io/projected/936e4810-725f-42c8-95a2-4e307b5b7af5-kube-api-access-4rftb\") pod \"nova-api-db-create-sqnxr\" (UID: \"936e4810-725f-42c8-95a2-4e307b5b7af5\") " pod="openstack/nova-api-db-create-sqnxr" Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.990945 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-sd7q4"] Oct 01 13:14:09 crc kubenswrapper[4851]: I1001 13:14:09.992179 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-sd7q4" Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.003192 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-sd7q4"] Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.004459 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-b55c56cb7-m2hqv" event={"ID":"39d289ff-07b0-479b-8f7a-bb1e5c108be2","Type":"ContainerStarted","Data":"4acd074be6c9471ffa1329faa26d8e7977a8fdbd4d1fffe703920e6ba0c7fc5b"} Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.004620 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-b55c56cb7-m2hqv" event={"ID":"39d289ff-07b0-479b-8f7a-bb1e5c108be2","Type":"ContainerStarted","Data":"d5997ea751ea40a31946d42a4e10fb2c6a10327a42b6989334a787ac6be6a28f"} Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.004653 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-b55c56cb7-m2hqv" event={"ID":"39d289ff-07b0-479b-8f7a-bb1e5c108be2","Type":"ContainerStarted","Data":"7bee0d1037c4655f1b8c3360fd03426dadec7c0ad3c6b189830440e45733a789"} Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.004665 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.004677 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.006971 4851 generic.go:334] "Generic (PLEG): container finished" podID="556db500-a192-4dfa-8a7f-4ab1d1e7e15b" containerID="e64190c7410f428cda3cb53f01f5f24b91b5ecd57305ce3aaaffc33f3c6acb7e" exitCode=143 Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.007035 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"556db500-a192-4dfa-8a7f-4ab1d1e7e15b","Type":"ContainerDied","Data":"e64190c7410f428cda3cb53f01f5f24b91b5ecd57305ce3aaaffc33f3c6acb7e"} Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.023653 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9fc6\" (UniqueName: \"kubernetes.io/projected/4eb25592-152d-4ad6-9fcc-46efaee35645-kube-api-access-w9fc6\") pod \"nova-cell0-db-create-cchtl\" (UID: \"4eb25592-152d-4ad6-9fcc-46efaee35645\") " pod="openstack/nova-cell0-db-create-cchtl" Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.023718 4851 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-4rftb\" (UniqueName: \"kubernetes.io/projected/936e4810-725f-42c8-95a2-4e307b5b7af5-kube-api-access-4rftb\") pod \"nova-api-db-create-sqnxr\" (UID: \"936e4810-725f-42c8-95a2-4e307b5b7af5\") " pod="openstack/nova-api-db-create-sqnxr" Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.041841 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-b55c56cb7-m2hqv" podStartSLOduration=8.041825308 podStartE2EDuration="8.041825308s" podCreationTimestamp="2025-10-01 13:14:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:14:10.036605439 +0000 UTC m=+1258.381722915" watchObservedRunningTime="2025-10-01 13:14:10.041825308 +0000 UTC m=+1258.386942794" Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.042812 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rftb\" (UniqueName: \"kubernetes.io/projected/936e4810-725f-42c8-95a2-4e307b5b7af5-kube-api-access-4rftb\") pod \"nova-api-db-create-sqnxr\" (UID: \"936e4810-725f-42c8-95a2-4e307b5b7af5\") " pod="openstack/nova-api-db-create-sqnxr" Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.087186 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.087521 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="39c1516d-1f7f-4814-b712-cfa3355ded27" containerName="glance-log" containerID="cri-o://81d8977f210e5feb5373c99f20ced4dbc1d5da428cccf180185431132777899a" gracePeriod=30 Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.088008 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="39c1516d-1f7f-4814-b712-cfa3355ded27" containerName="glance-httpd" containerID="cri-o://bf2cda50c2b2b074f7d44b4f289e0e262020b11056db7c66dad943b5eed457fb" gracePeriod=30 Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.110003 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-sqnxr" Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.125041 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9fc6\" (UniqueName: \"kubernetes.io/projected/4eb25592-152d-4ad6-9fcc-46efaee35645-kube-api-access-w9fc6\") pod \"nova-cell0-db-create-cchtl\" (UID: \"4eb25592-152d-4ad6-9fcc-46efaee35645\") " pod="openstack/nova-cell0-db-create-cchtl" Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.125100 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnhzs\" (UniqueName: \"kubernetes.io/projected/3d926f8f-83f5-470e-9dba-d2b3478d9dae-kube-api-access-dnhzs\") pod \"nova-cell1-db-create-sd7q4\" (UID: \"3d926f8f-83f5-470e-9dba-d2b3478d9dae\") " pod="openstack/nova-cell1-db-create-sd7q4" Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.147519 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9fc6\" (UniqueName: \"kubernetes.io/projected/4eb25592-152d-4ad6-9fcc-46efaee35645-kube-api-access-w9fc6\") pod \"nova-cell0-db-create-cchtl\" (UID: \"4eb25592-152d-4ad6-9fcc-46efaee35645\") " pod="openstack/nova-cell0-db-create-cchtl" Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.224940 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-cchtl" Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.227914 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnhzs\" (UniqueName: \"kubernetes.io/projected/3d926f8f-83f5-470e-9dba-d2b3478d9dae-kube-api-access-dnhzs\") pod \"nova-cell1-db-create-sd7q4\" (UID: \"3d926f8f-83f5-470e-9dba-d2b3478d9dae\") " pod="openstack/nova-cell1-db-create-sd7q4" Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.245090 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnhzs\" (UniqueName: \"kubernetes.io/projected/3d926f8f-83f5-470e-9dba-d2b3478d9dae-kube-api-access-dnhzs\") pod \"nova-cell1-db-create-sd7q4\" (UID: \"3d926f8f-83f5-470e-9dba-d2b3478d9dae\") " pod="openstack/nova-cell1-db-create-sd7q4" Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.286552 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:14:10 crc kubenswrapper[4851]: W1001 13:14:10.304642 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7909b1e5_aeb3_42a4_85d0_4d3c142fab35.slice/crio-0db66f6c2154e146023835b0a1a8bc7be7b3820dcc8368d32ff9648c28343b3b WatchSource:0}: Error finding container 0db66f6c2154e146023835b0a1a8bc7be7b3820dcc8368d32ff9648c28343b3b: Status 404 returned error can't find the container with id 0db66f6c2154e146023835b0a1a8bc7be7b3820dcc8368d32ff9648c28343b3b Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.317061 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-sd7q4" Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.372853 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="973a48f4-ab11-4c1d-b358-36c986b44f8c" path="/var/lib/kubelet/pods/973a48f4-ab11-4c1d-b358-36c986b44f8c/volumes" Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.373791 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97ea855a-7426-4e7d-af3d-c7d498622629" path="/var/lib/kubelet/pods/97ea855a-7426-4e7d-af3d-c7d498622629/volumes" Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.425069 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.822182 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-sqnxr"] Oct 01 13:14:10 crc kubenswrapper[4851]: W1001 13:14:10.880628 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod936e4810_725f_42c8_95a2_4e307b5b7af5.slice/crio-01dcb7d422ff5b8ba9c089696042c7c0f702efda829e7d765a133178451a8d6b WatchSource:0}: Error finding container 01dcb7d422ff5b8ba9c089696042c7c0f702efda829e7d765a133178451a8d6b: Status 404 returned error can't find the container with id 01dcb7d422ff5b8ba9c089696042c7c0f702efda829e7d765a133178451a8d6b Oct 01 13:14:10 crc kubenswrapper[4851]: I1001 13:14:10.925265 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.021770 4851 generic.go:334] "Generic (PLEG): container finished" podID="39c1516d-1f7f-4814-b712-cfa3355ded27" containerID="bf2cda50c2b2b074f7d44b4f289e0e262020b11056db7c66dad943b5eed457fb" exitCode=0 Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.021795 4851 generic.go:334] "Generic (PLEG): container finished" podID="39c1516d-1f7f-4814-b712-cfa3355ded27" containerID="81d8977f210e5feb5373c99f20ced4dbc1d5da428cccf180185431132777899a" exitCode=143 Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.021827 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"39c1516d-1f7f-4814-b712-cfa3355ded27","Type":"ContainerDied","Data":"bf2cda50c2b2b074f7d44b4f289e0e262020b11056db7c66dad943b5eed457fb"} Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.021852 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"39c1516d-1f7f-4814-b712-cfa3355ded27","Type":"ContainerDied","Data":"81d8977f210e5feb5373c99f20ced4dbc1d5da428cccf180185431132777899a"} Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.024374 4851 generic.go:334] "Generic (PLEG): container finished" podID="1db63449-71cb-4fa2-86db-43a83a914643" containerID="c39866bdf066b1da181c5a3017d9775a7100cc6f4794767d4519b16f3085d3af" exitCode=137 Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.024413 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67b666754b-b52ns" event={"ID":"1db63449-71cb-4fa2-86db-43a83a914643","Type":"ContainerDied","Data":"c39866bdf066b1da181c5a3017d9775a7100cc6f4794767d4519b16f3085d3af"} Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.024429 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67b666754b-b52ns" 
event={"ID":"1db63449-71cb-4fa2-86db-43a83a914643","Type":"ContainerDied","Data":"a0d437c9c4f231e21a8f5468d45f8371f0cd0f52b2d05edefaf9a17aa828647a"} Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.024444 4851 scope.go:117] "RemoveContainer" containerID="1566f50774133ea95c6a0e9c324d2e5bcf12901648eb5a468022250e57e983e9" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.024577 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67b666754b-b52ns" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.028430 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-sqnxr" event={"ID":"936e4810-725f-42c8-95a2-4e307b5b7af5","Type":"ContainerStarted","Data":"01dcb7d422ff5b8ba9c089696042c7c0f702efda829e7d765a133178451a8d6b"} Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.044955 4851 generic.go:334] "Generic (PLEG): container finished" podID="556db500-a192-4dfa-8a7f-4ab1d1e7e15b" containerID="3c62828291e0d54dc121908d792d7badaec5beb089700b9e32d232ebbc2d7ccd" exitCode=0 Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.045009 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"556db500-a192-4dfa-8a7f-4ab1d1e7e15b","Type":"ContainerDied","Data":"3c62828291e0d54dc121908d792d7badaec5beb089700b9e32d232ebbc2d7ccd"} Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.062475 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1db63449-71cb-4fa2-86db-43a83a914643-config-data\") pod \"1db63449-71cb-4fa2-86db-43a83a914643\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.062536 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1db63449-71cb-4fa2-86db-43a83a914643-horizon-secret-key\") pod \"1db63449-71cb-4fa2-86db-43a83a914643\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.063345 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8v56\" (UniqueName: \"kubernetes.io/projected/1db63449-71cb-4fa2-86db-43a83a914643-kube-api-access-l8v56\") pod \"1db63449-71cb-4fa2-86db-43a83a914643\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.063459 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1db63449-71cb-4fa2-86db-43a83a914643-combined-ca-bundle\") pod \"1db63449-71cb-4fa2-86db-43a83a914643\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.063478 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1db63449-71cb-4fa2-86db-43a83a914643-scripts\") pod \"1db63449-71cb-4fa2-86db-43a83a914643\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.063602 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1db63449-71cb-4fa2-86db-43a83a914643-horizon-tls-certs\") pod \"1db63449-71cb-4fa2-86db-43a83a914643\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " Oct 01 13:14:11 crc 
kubenswrapper[4851]: I1001 13:14:11.063639 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1db63449-71cb-4fa2-86db-43a83a914643-logs\") pod \"1db63449-71cb-4fa2-86db-43a83a914643\" (UID: \"1db63449-71cb-4fa2-86db-43a83a914643\") " Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.064799 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1db63449-71cb-4fa2-86db-43a83a914643-logs" (OuterVolumeSpecName: "logs") pod "1db63449-71cb-4fa2-86db-43a83a914643" (UID: "1db63449-71cb-4fa2-86db-43a83a914643"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.072240 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1db63449-71cb-4fa2-86db-43a83a914643-kube-api-access-l8v56" (OuterVolumeSpecName: "kube-api-access-l8v56") pod "1db63449-71cb-4fa2-86db-43a83a914643" (UID: "1db63449-71cb-4fa2-86db-43a83a914643"). InnerVolumeSpecName "kube-api-access-l8v56". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.074759 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7909b1e5-aeb3-42a4-85d0-4d3c142fab35","Type":"ContainerStarted","Data":"0c780ef1d09c95cce3974f0e90e1b5c259f6f24de061c7126ed0c371da4430aa"} Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.074795 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7909b1e5-aeb3-42a4-85d0-4d3c142fab35","Type":"ContainerStarted","Data":"0db66f6c2154e146023835b0a1a8bc7be7b3820dcc8368d32ff9648c28343b3b"} Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.075840 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1db63449-71cb-4fa2-86db-43a83a914643-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1db63449-71cb-4fa2-86db-43a83a914643" (UID: "1db63449-71cb-4fa2-86db-43a83a914643"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.092264 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="556db500-a192-4dfa-8a7f-4ab1d1e7e15b" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.166:9292/healthcheck\": dial tcp 10.217.0.166:9292: connect: connection refused" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.092558 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="556db500-a192-4dfa-8a7f-4ab1d1e7e15b" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.166:9292/healthcheck\": dial tcp 10.217.0.166:9292: connect: connection refused" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.162965 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-sd7q4"] Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.171783 4851 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1db63449-71cb-4fa2-86db-43a83a914643-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.171813 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8v56\" (UniqueName: \"kubernetes.io/projected/1db63449-71cb-4fa2-86db-43a83a914643-kube-api-access-l8v56\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.171822 4851 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1db63449-71cb-4fa2-86db-43a83a914643-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.179799 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1db63449-71cb-4fa2-86db-43a83a914643-scripts" (OuterVolumeSpecName: "scripts") pod "1db63449-71cb-4fa2-86db-43a83a914643" (UID: "1db63449-71cb-4fa2-86db-43a83a914643"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.189400 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-cchtl"] Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.222233 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1db63449-71cb-4fa2-86db-43a83a914643-config-data" (OuterVolumeSpecName: "config-data") pod "1db63449-71cb-4fa2-86db-43a83a914643" (UID: "1db63449-71cb-4fa2-86db-43a83a914643"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.269779 4851 scope.go:117] "RemoveContainer" containerID="c39866bdf066b1da181c5a3017d9775a7100cc6f4794767d4519b16f3085d3af" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.272032 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1db63449-71cb-4fa2-86db-43a83a914643-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "1db63449-71cb-4fa2-86db-43a83a914643" (UID: "1db63449-71cb-4fa2-86db-43a83a914643"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.273198 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1db63449-71cb-4fa2-86db-43a83a914643-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.273219 4851 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1db63449-71cb-4fa2-86db-43a83a914643-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.273230 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1db63449-71cb-4fa2-86db-43a83a914643-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.311152 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1db63449-71cb-4fa2-86db-43a83a914643-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1db63449-71cb-4fa2-86db-43a83a914643" (UID: "1db63449-71cb-4fa2-86db-43a83a914643"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.314679 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:14:11 crc kubenswrapper[4851]: W1001 13:14:11.360691 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4eb25592_152d_4ad6_9fcc_46efaee35645.slice/crio-0fa80fa28b24a6242dccf7082b626493b15a2849179428bb2c34cd138f95abb7 WatchSource:0}: Error finding container 0fa80fa28b24a6242dccf7082b626493b15a2849179428bb2c34cd138f95abb7: Status 404 returned error can't find the container with id 0fa80fa28b24a6242dccf7082b626493b15a2849179428bb2c34cd138f95abb7 Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.389730 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1db63449-71cb-4fa2-86db-43a83a914643-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.476592 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67b666754b-b52ns"] Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.489653 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-67b666754b-b52ns"] Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.493577 4851 scope.go:117] "RemoveContainer" containerID="1566f50774133ea95c6a0e9c324d2e5bcf12901648eb5a468022250e57e983e9" Oct 01 13:14:11 crc kubenswrapper[4851]: E1001 13:14:11.502192 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1566f50774133ea95c6a0e9c324d2e5bcf12901648eb5a468022250e57e983e9\": container with ID starting with 1566f50774133ea95c6a0e9c324d2e5bcf12901648eb5a468022250e57e983e9 not found: ID does not exist" containerID="1566f50774133ea95c6a0e9c324d2e5bcf12901648eb5a468022250e57e983e9" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.502237 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1566f50774133ea95c6a0e9c324d2e5bcf12901648eb5a468022250e57e983e9"} err="failed to get container status \"1566f50774133ea95c6a0e9c324d2e5bcf12901648eb5a468022250e57e983e9\": rpc 
error: code = NotFound desc = could not find container \"1566f50774133ea95c6a0e9c324d2e5bcf12901648eb5a468022250e57e983e9\": container with ID starting with 1566f50774133ea95c6a0e9c324d2e5bcf12901648eb5a468022250e57e983e9 not found: ID does not exist" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.502260 4851 scope.go:117] "RemoveContainer" containerID="c39866bdf066b1da181c5a3017d9775a7100cc6f4794767d4519b16f3085d3af" Oct 01 13:14:11 crc kubenswrapper[4851]: E1001 13:14:11.504849 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c39866bdf066b1da181c5a3017d9775a7100cc6f4794767d4519b16f3085d3af\": container with ID starting with c39866bdf066b1da181c5a3017d9775a7100cc6f4794767d4519b16f3085d3af not found: ID does not exist" containerID="c39866bdf066b1da181c5a3017d9775a7100cc6f4794767d4519b16f3085d3af" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.504880 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c39866bdf066b1da181c5a3017d9775a7100cc6f4794767d4519b16f3085d3af"} err="failed to get container status \"c39866bdf066b1da181c5a3017d9775a7100cc6f4794767d4519b16f3085d3af\": rpc error: code = NotFound desc = could not find container \"c39866bdf066b1da181c5a3017d9775a7100cc6f4794767d4519b16f3085d3af\": container with ID starting with c39866bdf066b1da181c5a3017d9775a7100cc6f4794767d4519b16f3085d3af not found: ID does not exist" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.776677 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.911443 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-config-data\") pod \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.911469 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44hbg\" (UniqueName: \"kubernetes.io/projected/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-kube-api-access-44hbg\") pod \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.911561 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.911584 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-combined-ca-bundle\") pod \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.911634 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-logs\") pod \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.911798 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-httpd-run\") pod \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.911817 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-scripts\") pod \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.911836 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-internal-tls-certs\") pod \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\" (UID: \"556db500-a192-4dfa-8a7f-4ab1d1e7e15b\") " Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.920476 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-logs" (OuterVolumeSpecName: "logs") pod "556db500-a192-4dfa-8a7f-4ab1d1e7e15b" (UID: "556db500-a192-4dfa-8a7f-4ab1d1e7e15b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.920658 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "556db500-a192-4dfa-8a7f-4ab1d1e7e15b" (UID: "556db500-a192-4dfa-8a7f-4ab1d1e7e15b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.931995 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-scripts" (OuterVolumeSpecName: "scripts") pod "556db500-a192-4dfa-8a7f-4ab1d1e7e15b" (UID: "556db500-a192-4dfa-8a7f-4ab1d1e7e15b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.932117 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "556db500-a192-4dfa-8a7f-4ab1d1e7e15b" (UID: "556db500-a192-4dfa-8a7f-4ab1d1e7e15b"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.955683 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-kube-api-access-44hbg" (OuterVolumeSpecName: "kube-api-access-44hbg") pod "556db500-a192-4dfa-8a7f-4ab1d1e7e15b" (UID: "556db500-a192-4dfa-8a7f-4ab1d1e7e15b"). InnerVolumeSpecName "kube-api-access-44hbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.958658 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "556db500-a192-4dfa-8a7f-4ab1d1e7e15b" (UID: "556db500-a192-4dfa-8a7f-4ab1d1e7e15b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:11 crc kubenswrapper[4851]: I1001 13:14:11.977819 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.013696 4851 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.014025 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.014035 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44hbg\" (UniqueName: \"kubernetes.io/projected/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-kube-api-access-44hbg\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.014045 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.014066 4851 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.014074 4851 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.033807 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-config-data" (OuterVolumeSpecName: "config-data") pod "556db500-a192-4dfa-8a7f-4ab1d1e7e15b" (UID: "556db500-a192-4dfa-8a7f-4ab1d1e7e15b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.104073 4851 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.114640 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"39c1516d-1f7f-4814-b712-cfa3355ded27\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.114685 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-public-tls-certs\") pod \"39c1516d-1f7f-4814-b712-cfa3355ded27\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.114732 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39c1516d-1f7f-4814-b712-cfa3355ded27-httpd-run\") pod \"39c1516d-1f7f-4814-b712-cfa3355ded27\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.114803 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-combined-ca-bundle\") pod \"39c1516d-1f7f-4814-b712-cfa3355ded27\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.114836 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39c1516d-1f7f-4814-b712-cfa3355ded27-logs\") pod \"39c1516d-1f7f-4814-b712-cfa3355ded27\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.114898 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-config-data\") pod \"39c1516d-1f7f-4814-b712-cfa3355ded27\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.114951 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-scripts\") pod \"39c1516d-1f7f-4814-b712-cfa3355ded27\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.114987 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmw5w\" (UniqueName: \"kubernetes.io/projected/39c1516d-1f7f-4814-b712-cfa3355ded27-kube-api-access-zmw5w\") pod \"39c1516d-1f7f-4814-b712-cfa3355ded27\" (UID: \"39c1516d-1f7f-4814-b712-cfa3355ded27\") " Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.115386 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.115398 4851 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 01 
13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.120350 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39c1516d-1f7f-4814-b712-cfa3355ded27-logs" (OuterVolumeSpecName: "logs") pod "39c1516d-1f7f-4814-b712-cfa3355ded27" (UID: "39c1516d-1f7f-4814-b712-cfa3355ded27"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.120527 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39c1516d-1f7f-4814-b712-cfa3355ded27-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "39c1516d-1f7f-4814-b712-cfa3355ded27" (UID: "39c1516d-1f7f-4814-b712-cfa3355ded27"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.122599 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39c1516d-1f7f-4814-b712-cfa3355ded27-kube-api-access-zmw5w" (OuterVolumeSpecName: "kube-api-access-zmw5w") pod "39c1516d-1f7f-4814-b712-cfa3355ded27" (UID: "39c1516d-1f7f-4814-b712-cfa3355ded27"). InnerVolumeSpecName "kube-api-access-zmw5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.123515 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "556db500-a192-4dfa-8a7f-4ab1d1e7e15b" (UID: "556db500-a192-4dfa-8a7f-4ab1d1e7e15b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.123945 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-scripts" (OuterVolumeSpecName: "scripts") pod "39c1516d-1f7f-4814-b712-cfa3355ded27" (UID: "39c1516d-1f7f-4814-b712-cfa3355ded27"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.124094 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "39c1516d-1f7f-4814-b712-cfa3355ded27" (UID: "39c1516d-1f7f-4814-b712-cfa3355ded27"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.132294 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-sd7q4" event={"ID":"3d926f8f-83f5-470e-9dba-d2b3478d9dae","Type":"ContainerStarted","Data":"60bf57a00bf28e2e0867c6e2e58142fcce14ec0f60100711fd8fd8ee0e933904"} Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.132343 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-sd7q4" event={"ID":"3d926f8f-83f5-470e-9dba-d2b3478d9dae","Type":"ContainerStarted","Data":"6a11c92588c72d57482d25a4a128008db6299ab276fec02539e84015e6147a0f"} Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.147219 4851 generic.go:334] "Generic (PLEG): container finished" podID="4eb25592-152d-4ad6-9fcc-46efaee35645" containerID="99b4ea3c0fcd4ea0a7809a3b999003e995bb59a8ca7fef21a8aaae20873e59b2" exitCode=0 Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.147315 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-cchtl" event={"ID":"4eb25592-152d-4ad6-9fcc-46efaee35645","Type":"ContainerDied","Data":"99b4ea3c0fcd4ea0a7809a3b999003e995bb59a8ca7fef21a8aaae20873e59b2"} Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.147340 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-cchtl" event={"ID":"4eb25592-152d-4ad6-9fcc-46efaee35645","Type":"ContainerStarted","Data":"0fa80fa28b24a6242dccf7082b626493b15a2849179428bb2c34cd138f95abb7"} Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.149668 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"556db500-a192-4dfa-8a7f-4ab1d1e7e15b","Type":"ContainerDied","Data":"572b4c97eb7a514af19a21b9fb1ee8427107ac82603d916c74fcc825659e3f41"} Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.149703 4851 scope.go:117] "RemoveContainer" containerID="3c62828291e0d54dc121908d792d7badaec5beb089700b9e32d232ebbc2d7ccd" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.149814 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.160053 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7909b1e5-aeb3-42a4-85d0-4d3c142fab35","Type":"ContainerStarted","Data":"13044738ec35137f45762e16074e74ac068396b8c20c1d1ae74ad1216f8edbad"} Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.185064 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.185090 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"39c1516d-1f7f-4814-b712-cfa3355ded27","Type":"ContainerDied","Data":"87498b7b2cb915c69f1b7f4b83154dc6382ada6e87d20f2ed02fbcc90b02aad6"} Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.194440 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "39c1516d-1f7f-4814-b712-cfa3355ded27" (UID: "39c1516d-1f7f-4814-b712-cfa3355ded27"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.205243 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39c1516d-1f7f-4814-b712-cfa3355ded27" (UID: "39c1516d-1f7f-4814-b712-cfa3355ded27"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.205254 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-config-data" (OuterVolumeSpecName: "config-data") pod "39c1516d-1f7f-4814-b712-cfa3355ded27" (UID: "39c1516d-1f7f-4814-b712-cfa3355ded27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.195611 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-sd7q4" podStartSLOduration=3.195592547 podStartE2EDuration="3.195592547s" podCreationTimestamp="2025-10-01 13:14:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:14:12.173037873 +0000 UTC m=+1260.518155359" watchObservedRunningTime="2025-10-01 13:14:12.195592547 +0000 UTC m=+1260.540710033" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.217196 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.217244 4851 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/556db500-a192-4dfa-8a7f-4ab1d1e7e15b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.217261 4851 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39c1516d-1f7f-4814-b712-cfa3355ded27-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.217277 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.217289 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.217301 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmw5w\" (UniqueName: \"kubernetes.io/projected/39c1516d-1f7f-4814-b712-cfa3355ded27-kube-api-access-zmw5w\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.217344 4851 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.217356 4851 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/39c1516d-1f7f-4814-b712-cfa3355ded27-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.217367 4851 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39c1516d-1f7f-4814-b712-cfa3355ded27-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.243023 4851 generic.go:334] "Generic (PLEG): container finished" podID="936e4810-725f-42c8-95a2-4e307b5b7af5" containerID="b4aa968fda4aaa434ad0dfbaced946667eea5f12c8a9302752650647eaa06653" exitCode=0 Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.243072 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-sqnxr" event={"ID":"936e4810-725f-42c8-95a2-4e307b5b7af5","Type":"ContainerDied","Data":"b4aa968fda4aaa434ad0dfbaced946667eea5f12c8a9302752650647eaa06653"} Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.277129 4851 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.318951 4851 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.332699 4851 scope.go:117] "RemoveContainer" containerID="e64190c7410f428cda3cb53f01f5f24b91b5ecd57305ce3aaaffc33f3c6acb7e" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.351203 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1db63449-71cb-4fa2-86db-43a83a914643" path="/var/lib/kubelet/pods/1db63449-71cb-4fa2-86db-43a83a914643/volumes" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.352127 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.352151 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.364589 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:14:12 crc kubenswrapper[4851]: E1001 13:14:12.366362 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1db63449-71cb-4fa2-86db-43a83a914643" containerName="horizon" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.366397 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="1db63449-71cb-4fa2-86db-43a83a914643" containerName="horizon" Oct 01 13:14:12 crc kubenswrapper[4851]: E1001 13:14:12.366417 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1db63449-71cb-4fa2-86db-43a83a914643" containerName="horizon-log" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.366424 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="1db63449-71cb-4fa2-86db-43a83a914643" containerName="horizon-log" Oct 01 13:14:12 crc kubenswrapper[4851]: E1001 13:14:12.366434 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556db500-a192-4dfa-8a7f-4ab1d1e7e15b" containerName="glance-httpd" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.366440 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="556db500-a192-4dfa-8a7f-4ab1d1e7e15b" containerName="glance-httpd" Oct 01 13:14:12 crc kubenswrapper[4851]: E1001 13:14:12.366457 
4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39c1516d-1f7f-4814-b712-cfa3355ded27" containerName="glance-httpd" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.366475 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c1516d-1f7f-4814-b712-cfa3355ded27" containerName="glance-httpd" Oct 01 13:14:12 crc kubenswrapper[4851]: E1001 13:14:12.366483 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556db500-a192-4dfa-8a7f-4ab1d1e7e15b" containerName="glance-log" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.366488 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="556db500-a192-4dfa-8a7f-4ab1d1e7e15b" containerName="glance-log" Oct 01 13:14:12 crc kubenswrapper[4851]: E1001 13:14:12.366524 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39c1516d-1f7f-4814-b712-cfa3355ded27" containerName="glance-log" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.366531 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c1516d-1f7f-4814-b712-cfa3355ded27" containerName="glance-log" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.366794 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="1db63449-71cb-4fa2-86db-43a83a914643" containerName="horizon" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.366827 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="39c1516d-1f7f-4814-b712-cfa3355ded27" containerName="glance-log" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.366838 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="556db500-a192-4dfa-8a7f-4ab1d1e7e15b" containerName="glance-httpd" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.366847 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="39c1516d-1f7f-4814-b712-cfa3355ded27" containerName="glance-httpd" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.366862 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="1db63449-71cb-4fa2-86db-43a83a914643" containerName="horizon-log" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.366881 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="556db500-a192-4dfa-8a7f-4ab1d1e7e15b" containerName="glance-log" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.368220 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.371696 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.371859 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.379813 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.385041 4851 scope.go:117] "RemoveContainer" containerID="bf2cda50c2b2b074f7d44b4f289e0e262020b11056db7c66dad943b5eed457fb" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.429884 4851 scope.go:117] "RemoveContainer" containerID="81d8977f210e5feb5373c99f20ced4dbc1d5da428cccf180185431132777899a" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.506924 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.514483 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.526566 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67rr7\" (UniqueName: \"kubernetes.io/projected/41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0-kube-api-access-67rr7\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.526711 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.526805 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.526894 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.527145 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.527194 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0-logs\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.527267 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.527528 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.530279 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.542806 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.542900 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.546065 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.546224 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.630267 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.630356 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.630958 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.631005 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c-logs\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.631028 4851 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-67rr7\" (UniqueName: \"kubernetes.io/projected/41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0-kube-api-access-67rr7\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.631068 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.631100 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.631125 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.631147 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.631171 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.631192 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.631239 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfgnq\" (UniqueName: \"kubernetes.io/projected/ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c-kube-api-access-pfgnq\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.631262 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.631279 4851 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0-logs\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.631315 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.631335 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.631928 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0-logs\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.631946 4851 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.632155 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.639082 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.647364 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.647846 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.651026 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.668137 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67rr7\" (UniqueName: \"kubernetes.io/projected/41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0-kube-api-access-67rr7\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.678237 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.729567 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.732680 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.732770 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.732810 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c-logs\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.732852 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.733276 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c-logs\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.733726 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.733768 4851 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.733847 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfgnq\" (UniqueName: \"kubernetes.io/projected/ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c-kube-api-access-pfgnq\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.733989 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.734776 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.735070 4851 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.740153 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.741878 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.744186 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.747268 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.786878 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfgnq\" (UniqueName: 
\"kubernetes.io/projected/ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c-kube-api-access-pfgnq\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.792096 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c\") " pod="openstack/glance-default-external-api-0" Oct 01 13:14:12 crc kubenswrapper[4851]: I1001 13:14:12.862575 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 13:14:13 crc kubenswrapper[4851]: I1001 13:14:13.255085 4851 generic.go:334] "Generic (PLEG): container finished" podID="3d926f8f-83f5-470e-9dba-d2b3478d9dae" containerID="60bf57a00bf28e2e0867c6e2e58142fcce14ec0f60100711fd8fd8ee0e933904" exitCode=0 Oct 01 13:14:13 crc kubenswrapper[4851]: I1001 13:14:13.255470 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-sd7q4" event={"ID":"3d926f8f-83f5-470e-9dba-d2b3478d9dae","Type":"ContainerDied","Data":"60bf57a00bf28e2e0867c6e2e58142fcce14ec0f60100711fd8fd8ee0e933904"} Oct 01 13:14:13 crc kubenswrapper[4851]: I1001 13:14:13.267260 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7909b1e5-aeb3-42a4-85d0-4d3c142fab35","Type":"ContainerStarted","Data":"6f68efdbd0237b97cf6a4c1fe0a1b4e696696562807dac83f5933afc5b69b61f"} Oct 01 13:14:13 crc kubenswrapper[4851]: I1001 13:14:13.414591 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:14:13 crc kubenswrapper[4851]: I1001 13:14:13.573981 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:14:13 crc kubenswrapper[4851]: W1001 13:14:13.603687 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff6e380f_fa95_4b19_b3ed_26f7c9a8f47c.slice/crio-ab3e08326c693ec0c937b3a82fc3b7bda22a2c9880177ad2dc98266e5291f9df WatchSource:0}: Error finding container ab3e08326c693ec0c937b3a82fc3b7bda22a2c9880177ad2dc98266e5291f9df: Status 404 returned error can't find the container with id ab3e08326c693ec0c937b3a82fc3b7bda22a2c9880177ad2dc98266e5291f9df Oct 01 13:14:13 crc kubenswrapper[4851]: I1001 13:14:13.772888 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-sqnxr" Oct 01 13:14:13 crc kubenswrapper[4851]: I1001 13:14:13.864887 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-cchtl" Oct 01 13:14:13 crc kubenswrapper[4851]: I1001 13:14:13.867706 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rftb\" (UniqueName: \"kubernetes.io/projected/936e4810-725f-42c8-95a2-4e307b5b7af5-kube-api-access-4rftb\") pod \"936e4810-725f-42c8-95a2-4e307b5b7af5\" (UID: \"936e4810-725f-42c8-95a2-4e307b5b7af5\") " Oct 01 13:14:13 crc kubenswrapper[4851]: I1001 13:14:13.880652 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/936e4810-725f-42c8-95a2-4e307b5b7af5-kube-api-access-4rftb" (OuterVolumeSpecName: "kube-api-access-4rftb") pod "936e4810-725f-42c8-95a2-4e307b5b7af5" (UID: "936e4810-725f-42c8-95a2-4e307b5b7af5"). InnerVolumeSpecName "kube-api-access-4rftb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:14:13 crc kubenswrapper[4851]: I1001 13:14:13.970090 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9fc6\" (UniqueName: \"kubernetes.io/projected/4eb25592-152d-4ad6-9fcc-46efaee35645-kube-api-access-w9fc6\") pod \"4eb25592-152d-4ad6-9fcc-46efaee35645\" (UID: \"4eb25592-152d-4ad6-9fcc-46efaee35645\") " Oct 01 13:14:13 crc kubenswrapper[4851]: I1001 13:14:13.970758 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rftb\" (UniqueName: \"kubernetes.io/projected/936e4810-725f-42c8-95a2-4e307b5b7af5-kube-api-access-4rftb\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:13 crc kubenswrapper[4851]: I1001 13:14:13.973742 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eb25592-152d-4ad6-9fcc-46efaee35645-kube-api-access-w9fc6" (OuterVolumeSpecName: "kube-api-access-w9fc6") pod "4eb25592-152d-4ad6-9fcc-46efaee35645" (UID: "4eb25592-152d-4ad6-9fcc-46efaee35645"). InnerVolumeSpecName "kube-api-access-w9fc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:14:14 crc kubenswrapper[4851]: I1001 13:14:14.071972 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9fc6\" (UniqueName: \"kubernetes.io/projected/4eb25592-152d-4ad6-9fcc-46efaee35645-kube-api-access-w9fc6\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:14 crc kubenswrapper[4851]: I1001 13:14:14.298340 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-cchtl" event={"ID":"4eb25592-152d-4ad6-9fcc-46efaee35645","Type":"ContainerDied","Data":"0fa80fa28b24a6242dccf7082b626493b15a2849179428bb2c34cd138f95abb7"} Oct 01 13:14:14 crc kubenswrapper[4851]: I1001 13:14:14.298588 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-cchtl" Oct 01 13:14:14 crc kubenswrapper[4851]: I1001 13:14:14.298595 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fa80fa28b24a6242dccf7082b626493b15a2849179428bb2c34cd138f95abb7" Oct 01 13:14:14 crc kubenswrapper[4851]: I1001 13:14:14.305749 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-sqnxr" Oct 01 13:14:14 crc kubenswrapper[4851]: I1001 13:14:14.305759 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-sqnxr" event={"ID":"936e4810-725f-42c8-95a2-4e307b5b7af5","Type":"ContainerDied","Data":"01dcb7d422ff5b8ba9c089696042c7c0f702efda829e7d765a133178451a8d6b"} Oct 01 13:14:14 crc kubenswrapper[4851]: I1001 13:14:14.305798 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01dcb7d422ff5b8ba9c089696042c7c0f702efda829e7d765a133178451a8d6b" Oct 01 13:14:14 crc kubenswrapper[4851]: I1001 13:14:14.307385 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c","Type":"ContainerStarted","Data":"ab3e08326c693ec0c937b3a82fc3b7bda22a2c9880177ad2dc98266e5291f9df"} Oct 01 13:14:14 crc kubenswrapper[4851]: I1001 13:14:14.309323 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0","Type":"ContainerStarted","Data":"95c9ffb73d9f08d2c5a749e81f3993781ab52e7fc695e8ec5eabfb7ee2fd30a6"} Oct 01 13:14:14 crc kubenswrapper[4851]: I1001 13:14:14.348126 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39c1516d-1f7f-4814-b712-cfa3355ded27" path="/var/lib/kubelet/pods/39c1516d-1f7f-4814-b712-cfa3355ded27/volumes" Oct 01 13:14:14 crc kubenswrapper[4851]: I1001 13:14:14.349131 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="556db500-a192-4dfa-8a7f-4ab1d1e7e15b" path="/var/lib/kubelet/pods/556db500-a192-4dfa-8a7f-4ab1d1e7e15b/volumes" Oct 01 13:14:14 crc kubenswrapper[4851]: I1001 13:14:14.821378 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-sd7q4" Oct 01 13:14:14 crc kubenswrapper[4851]: I1001 13:14:14.898084 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnhzs\" (UniqueName: \"kubernetes.io/projected/3d926f8f-83f5-470e-9dba-d2b3478d9dae-kube-api-access-dnhzs\") pod \"3d926f8f-83f5-470e-9dba-d2b3478d9dae\" (UID: \"3d926f8f-83f5-470e-9dba-d2b3478d9dae\") " Oct 01 13:14:14 crc kubenswrapper[4851]: I1001 13:14:14.906134 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d926f8f-83f5-470e-9dba-d2b3478d9dae-kube-api-access-dnhzs" (OuterVolumeSpecName: "kube-api-access-dnhzs") pod "3d926f8f-83f5-470e-9dba-d2b3478d9dae" (UID: "3d926f8f-83f5-470e-9dba-d2b3478d9dae"). InnerVolumeSpecName "kube-api-access-dnhzs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:14:15 crc kubenswrapper[4851]: I1001 13:14:15.000380 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnhzs\" (UniqueName: \"kubernetes.io/projected/3d926f8f-83f5-470e-9dba-d2b3478d9dae-kube-api-access-dnhzs\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:15 crc kubenswrapper[4851]: I1001 13:14:15.328055 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-sd7q4" event={"ID":"3d926f8f-83f5-470e-9dba-d2b3478d9dae","Type":"ContainerDied","Data":"6a11c92588c72d57482d25a4a128008db6299ab276fec02539e84015e6147a0f"} Oct 01 13:14:15 crc kubenswrapper[4851]: I1001 13:14:15.328105 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a11c92588c72d57482d25a4a128008db6299ab276fec02539e84015e6147a0f" Oct 01 13:14:15 crc kubenswrapper[4851]: I1001 13:14:15.328221 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-sd7q4" Oct 01 13:14:15 crc kubenswrapper[4851]: I1001 13:14:15.346101 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0","Type":"ContainerStarted","Data":"f9e2549781e6acffe82d03f5b27405ea28c4745501884c5fcaeb292ddf99c30d"} Oct 01 13:14:15 crc kubenswrapper[4851]: I1001 13:14:15.358121 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c","Type":"ContainerStarted","Data":"cb460e6ec778935dcb13cae66e4106e8b114410406b4d62abf0c4ab3021436d3"} Oct 01 13:14:15 crc kubenswrapper[4851]: I1001 13:14:15.362875 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7909b1e5-aeb3-42a4-85d0-4d3c142fab35","Type":"ContainerStarted","Data":"80fb442d0f64203d1cafcc4a1acef5d34a900b62f1b35fdbde40e0bb597dab56"} Oct 01 13:14:15 crc kubenswrapper[4851]: I1001 13:14:15.363056 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7909b1e5-aeb3-42a4-85d0-4d3c142fab35" containerName="ceilometer-central-agent" containerID="cri-o://0c780ef1d09c95cce3974f0e90e1b5c259f6f24de061c7126ed0c371da4430aa" gracePeriod=30 Oct 01 13:14:15 crc kubenswrapper[4851]: I1001 13:14:15.363364 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 13:14:15 crc kubenswrapper[4851]: I1001 13:14:15.363632 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7909b1e5-aeb3-42a4-85d0-4d3c142fab35" containerName="proxy-httpd" containerID="cri-o://80fb442d0f64203d1cafcc4a1acef5d34a900b62f1b35fdbde40e0bb597dab56" gracePeriod=30 Oct 01 13:14:15 crc kubenswrapper[4851]: I1001 13:14:15.363666 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7909b1e5-aeb3-42a4-85d0-4d3c142fab35" containerName="ceilometer-notification-agent" containerID="cri-o://13044738ec35137f45762e16074e74ac068396b8c20c1d1ae74ad1216f8edbad" gracePeriod=30 Oct 01 13:14:15 crc kubenswrapper[4851]: I1001 13:14:15.363709 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7909b1e5-aeb3-42a4-85d0-4d3c142fab35" containerName="sg-core" containerID="cri-o://6f68efdbd0237b97cf6a4c1fe0a1b4e696696562807dac83f5933afc5b69b61f" gracePeriod=30 Oct 01 
13:14:15 crc kubenswrapper[4851]: I1001 13:14:15.387538 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.928248531 podStartE2EDuration="6.387519161s" podCreationTimestamp="2025-10-01 13:14:09 +0000 UTC" firstStartedPulling="2025-10-01 13:14:10.306249706 +0000 UTC m=+1258.651367192" lastFinishedPulling="2025-10-01 13:14:14.765520336 +0000 UTC m=+1263.110637822" observedRunningTime="2025-10-01 13:14:15.384029672 +0000 UTC m=+1263.729147158" watchObservedRunningTime="2025-10-01 13:14:15.387519161 +0000 UTC m=+1263.732636647" Oct 01 13:14:16 crc kubenswrapper[4851]: I1001 13:14:16.374074 4851 generic.go:334] "Generic (PLEG): container finished" podID="7909b1e5-aeb3-42a4-85d0-4d3c142fab35" containerID="80fb442d0f64203d1cafcc4a1acef5d34a900b62f1b35fdbde40e0bb597dab56" exitCode=0 Oct 01 13:14:16 crc kubenswrapper[4851]: I1001 13:14:16.374442 4851 generic.go:334] "Generic (PLEG): container finished" podID="7909b1e5-aeb3-42a4-85d0-4d3c142fab35" containerID="6f68efdbd0237b97cf6a4c1fe0a1b4e696696562807dac83f5933afc5b69b61f" exitCode=2 Oct 01 13:14:16 crc kubenswrapper[4851]: I1001 13:14:16.374461 4851 generic.go:334] "Generic (PLEG): container finished" podID="7909b1e5-aeb3-42a4-85d0-4d3c142fab35" containerID="13044738ec35137f45762e16074e74ac068396b8c20c1d1ae74ad1216f8edbad" exitCode=0 Oct 01 13:14:16 crc kubenswrapper[4851]: I1001 13:14:16.374152 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7909b1e5-aeb3-42a4-85d0-4d3c142fab35","Type":"ContainerDied","Data":"80fb442d0f64203d1cafcc4a1acef5d34a900b62f1b35fdbde40e0bb597dab56"} Oct 01 13:14:16 crc kubenswrapper[4851]: I1001 13:14:16.374593 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7909b1e5-aeb3-42a4-85d0-4d3c142fab35","Type":"ContainerDied","Data":"6f68efdbd0237b97cf6a4c1fe0a1b4e696696562807dac83f5933afc5b69b61f"} Oct 01 13:14:16 crc kubenswrapper[4851]: I1001 13:14:16.374606 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7909b1e5-aeb3-42a4-85d0-4d3c142fab35","Type":"ContainerDied","Data":"13044738ec35137f45762e16074e74ac068396b8c20c1d1ae74ad1216f8edbad"} Oct 01 13:14:16 crc kubenswrapper[4851]: I1001 13:14:16.376606 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0","Type":"ContainerStarted","Data":"57d87f23bbad26a1233e79a8481870cd5b82b652bc34eab887c2882acc5886c0"} Oct 01 13:14:16 crc kubenswrapper[4851]: I1001 13:14:16.384566 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c","Type":"ContainerStarted","Data":"939c2199086277ff8669191190fa338c5a5ea3daa330584f894ea3146e30d3b9"} Oct 01 13:14:16 crc kubenswrapper[4851]: I1001 13:14:16.410636 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.410597016 podStartE2EDuration="4.410597016s" podCreationTimestamp="2025-10-01 13:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:14:16.399916771 +0000 UTC m=+1264.745034257" watchObservedRunningTime="2025-10-01 13:14:16.410597016 +0000 UTC m=+1264.755714512" Oct 01 13:14:16 crc kubenswrapper[4851]: I1001 13:14:16.426884 4851 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.4268654210000005 podStartE2EDuration="4.426865421s" podCreationTimestamp="2025-10-01 13:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:14:16.418320997 +0000 UTC m=+1264.763438483" watchObservedRunningTime="2025-10-01 13:14:16.426865421 +0000 UTC m=+1264.771982907" Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.396640 4851 generic.go:334] "Generic (PLEG): container finished" podID="7909b1e5-aeb3-42a4-85d0-4d3c142fab35" containerID="0c780ef1d09c95cce3974f0e90e1b5c259f6f24de061c7126ed0c371da4430aa" exitCode=0 Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.396716 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7909b1e5-aeb3-42a4-85d0-4d3c142fab35","Type":"ContainerDied","Data":"0c780ef1d09c95cce3974f0e90e1b5c259f6f24de061c7126ed0c371da4430aa"} Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.397038 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7909b1e5-aeb3-42a4-85d0-4d3c142fab35","Type":"ContainerDied","Data":"0db66f6c2154e146023835b0a1a8bc7be7b3820dcc8368d32ff9648c28343b3b"} Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.397052 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0db66f6c2154e146023835b0a1a8bc7be7b3820dcc8368d32ff9648c28343b3b" Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.401121 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.454085 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-sg-core-conf-yaml\") pod \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.454144 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-scripts\") pod \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.454230 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8m2h\" (UniqueName: \"kubernetes.io/projected/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-kube-api-access-l8m2h\") pod \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.454246 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-combined-ca-bundle\") pod \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.454310 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-run-httpd\") pod \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " Oct 01 13:14:17 crc 
kubenswrapper[4851]: I1001 13:14:17.454364 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-config-data\") pod \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.454402 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-log-httpd\") pod \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\" (UID: \"7909b1e5-aeb3-42a4-85d0-4d3c142fab35\") " Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.454970 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7909b1e5-aeb3-42a4-85d0-4d3c142fab35" (UID: "7909b1e5-aeb3-42a4-85d0-4d3c142fab35"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.455136 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7909b1e5-aeb3-42a4-85d0-4d3c142fab35" (UID: "7909b1e5-aeb3-42a4-85d0-4d3c142fab35"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.460900 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-kube-api-access-l8m2h" (OuterVolumeSpecName: "kube-api-access-l8m2h") pod "7909b1e5-aeb3-42a4-85d0-4d3c142fab35" (UID: "7909b1e5-aeb3-42a4-85d0-4d3c142fab35"). InnerVolumeSpecName "kube-api-access-l8m2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.464370 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-scripts" (OuterVolumeSpecName: "scripts") pod "7909b1e5-aeb3-42a4-85d0-4d3c142fab35" (UID: "7909b1e5-aeb3-42a4-85d0-4d3c142fab35"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.494562 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7909b1e5-aeb3-42a4-85d0-4d3c142fab35" (UID: "7909b1e5-aeb3-42a4-85d0-4d3c142fab35"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.552335 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7909b1e5-aeb3-42a4-85d0-4d3c142fab35" (UID: "7909b1e5-aeb3-42a4-85d0-4d3c142fab35"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.556283 4851 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.556624 4851 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.556729 4851 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.556827 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.556928 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8m2h\" (UniqueName: \"kubernetes.io/projected/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-kube-api-access-l8m2h\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.557023 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.574848 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-config-data" (OuterVolumeSpecName: "config-data") pod "7909b1e5-aeb3-42a4-85d0-4d3c142fab35" (UID: "7909b1e5-aeb3-42a4-85d0-4d3c142fab35"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:17 crc kubenswrapper[4851]: I1001 13:14:17.658983 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7909b1e5-aeb3-42a4-85d0-4d3c142fab35-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.132023 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.149201 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-b55c56cb7-m2hqv" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.408140 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.467281 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.476324 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.483410 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:14:18 crc kubenswrapper[4851]: E1001 13:14:18.483938 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d926f8f-83f5-470e-9dba-d2b3478d9dae" containerName="mariadb-database-create" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.483961 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d926f8f-83f5-470e-9dba-d2b3478d9dae" containerName="mariadb-database-create" Oct 01 13:14:18 crc kubenswrapper[4851]: E1001 13:14:18.483992 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb25592-152d-4ad6-9fcc-46efaee35645" containerName="mariadb-database-create" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.484002 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb25592-152d-4ad6-9fcc-46efaee35645" containerName="mariadb-database-create" Oct 01 13:14:18 crc kubenswrapper[4851]: E1001 13:14:18.484026 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7909b1e5-aeb3-42a4-85d0-4d3c142fab35" containerName="ceilometer-notification-agent" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.484035 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="7909b1e5-aeb3-42a4-85d0-4d3c142fab35" containerName="ceilometer-notification-agent" Oct 01 13:14:18 crc kubenswrapper[4851]: E1001 13:14:18.484054 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7909b1e5-aeb3-42a4-85d0-4d3c142fab35" containerName="ceilometer-central-agent" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.484061 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="7909b1e5-aeb3-42a4-85d0-4d3c142fab35" containerName="ceilometer-central-agent" Oct 01 13:14:18 crc kubenswrapper[4851]: E1001 13:14:18.484076 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7909b1e5-aeb3-42a4-85d0-4d3c142fab35" containerName="proxy-httpd" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.484084 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="7909b1e5-aeb3-42a4-85d0-4d3c142fab35" containerName="proxy-httpd" Oct 01 13:14:18 crc kubenswrapper[4851]: E1001 13:14:18.484103 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7909b1e5-aeb3-42a4-85d0-4d3c142fab35" containerName="sg-core" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.484110 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="7909b1e5-aeb3-42a4-85d0-4d3c142fab35" containerName="sg-core" Oct 01 13:14:18 crc kubenswrapper[4851]: E1001 13:14:18.484128 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="936e4810-725f-42c8-95a2-4e307b5b7af5" containerName="mariadb-database-create" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.484137 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="936e4810-725f-42c8-95a2-4e307b5b7af5" containerName="mariadb-database-create" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.484391 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="7909b1e5-aeb3-42a4-85d0-4d3c142fab35" 
containerName="ceilometer-central-agent" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.484417 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="7909b1e5-aeb3-42a4-85d0-4d3c142fab35" containerName="proxy-httpd" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.484432 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d926f8f-83f5-470e-9dba-d2b3478d9dae" containerName="mariadb-database-create" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.484449 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="936e4810-725f-42c8-95a2-4e307b5b7af5" containerName="mariadb-database-create" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.484463 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="7909b1e5-aeb3-42a4-85d0-4d3c142fab35" containerName="ceilometer-notification-agent" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.484474 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="7909b1e5-aeb3-42a4-85d0-4d3c142fab35" containerName="sg-core" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.484484 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eb25592-152d-4ad6-9fcc-46efaee35645" containerName="mariadb-database-create" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.486706 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.490002 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.493150 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.494959 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.576368 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.576425 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-scripts\") pod \"ceilometer-0\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.576462 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0158e7a7-afdc-402e-8822-7ab194243f8e-log-httpd\") pod \"ceilometer-0\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.576555 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0158e7a7-afdc-402e-8822-7ab194243f8e-run-httpd\") pod \"ceilometer-0\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.576593 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-vlcms\" (UniqueName: \"kubernetes.io/projected/0158e7a7-afdc-402e-8822-7ab194243f8e-kube-api-access-vlcms\") pod \"ceilometer-0\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.576619 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.576705 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-config-data\") pod \"ceilometer-0\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.678728 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0158e7a7-afdc-402e-8822-7ab194243f8e-run-httpd\") pod \"ceilometer-0\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.678788 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlcms\" (UniqueName: \"kubernetes.io/projected/0158e7a7-afdc-402e-8822-7ab194243f8e-kube-api-access-vlcms\") pod \"ceilometer-0\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.678851 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.679241 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0158e7a7-afdc-402e-8822-7ab194243f8e-run-httpd\") pod \"ceilometer-0\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.679768 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-config-data\") pod \"ceilometer-0\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.679896 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.679921 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-scripts\") pod \"ceilometer-0\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.679936 4851 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0158e7a7-afdc-402e-8822-7ab194243f8e-log-httpd\") pod \"ceilometer-0\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.680227 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0158e7a7-afdc-402e-8822-7ab194243f8e-log-httpd\") pod \"ceilometer-0\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.683911 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.684729 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.689290 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-scripts\") pod \"ceilometer-0\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.695766 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-config-data\") pod \"ceilometer-0\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.699147 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlcms\" (UniqueName: \"kubernetes.io/projected/0158e7a7-afdc-402e-8822-7ab194243f8e-kube-api-access-vlcms\") pod \"ceilometer-0\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " pod="openstack/ceilometer-0" Oct 01 13:14:18 crc kubenswrapper[4851]: I1001 13:14:18.813915 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:14:19 crc kubenswrapper[4851]: I1001 13:14:19.293351 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:14:19 crc kubenswrapper[4851]: I1001 13:14:19.328882 4851 scope.go:117] "RemoveContainer" containerID="c20a7b78f4412c0f21c222ebc0f47ef6d52387a25f64c9b0748dd8ea538d34b0" Oct 01 13:14:19 crc kubenswrapper[4851]: I1001 13:14:19.420057 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0158e7a7-afdc-402e-8822-7ab194243f8e","Type":"ContainerStarted","Data":"ad6615454cb0c5242f75e3a0daf1828d7fb27094fdf87ce141d3acd025194489"} Oct 01 13:14:20 crc kubenswrapper[4851]: I1001 13:14:20.340410 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7909b1e5-aeb3-42a4-85d0-4d3c142fab35" path="/var/lib/kubelet/pods/7909b1e5-aeb3-42a4-85d0-4d3c142fab35/volumes" Oct 01 13:14:20 crc kubenswrapper[4851]: I1001 13:14:20.446950 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"971ca0ac-6de7-42f1-bf29-5174fd80ced4","Type":"ContainerStarted","Data":"a68fbdceade2850c22adba2de4b6078e94625dcade81e98faf63ffd294964ee5"} Oct 01 13:14:20 crc kubenswrapper[4851]: I1001 13:14:20.449876 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0158e7a7-afdc-402e-8822-7ab194243f8e","Type":"ContainerStarted","Data":"07b6c171909ccf521028f5fe6ed6f90a0cc8f0ac59fa5d8abe35e2367a469f21"} Oct 01 13:14:20 crc kubenswrapper[4851]: I1001 13:14:20.449913 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0158e7a7-afdc-402e-8822-7ab194243f8e","Type":"ContainerStarted","Data":"44b1b979c9a39b7de714608c2e88bf7abc331eaef1e5f2195cd802a8885c3a23"} Oct 01 13:14:21 crc kubenswrapper[4851]: I1001 13:14:21.459043 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0158e7a7-afdc-402e-8822-7ab194243f8e","Type":"ContainerStarted","Data":"351a8782ce730dfc70ca3d3bbff35ee0417490abe9e8c745c832f5a8917fd981"} Oct 01 13:14:22 crc kubenswrapper[4851]: I1001 13:14:22.469260 4851 generic.go:334] "Generic (PLEG): container finished" podID="971ca0ac-6de7-42f1-bf29-5174fd80ced4" containerID="a68fbdceade2850c22adba2de4b6078e94625dcade81e98faf63ffd294964ee5" exitCode=1 Oct 01 13:14:22 crc kubenswrapper[4851]: I1001 13:14:22.469344 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"971ca0ac-6de7-42f1-bf29-5174fd80ced4","Type":"ContainerDied","Data":"a68fbdceade2850c22adba2de4b6078e94625dcade81e98faf63ffd294964ee5"} Oct 01 13:14:22 crc kubenswrapper[4851]: I1001 13:14:22.469681 4851 scope.go:117] "RemoveContainer" containerID="c20a7b78f4412c0f21c222ebc0f47ef6d52387a25f64c9b0748dd8ea538d34b0" Oct 01 13:14:22 crc kubenswrapper[4851]: I1001 13:14:22.470333 4851 scope.go:117] "RemoveContainer" containerID="a68fbdceade2850c22adba2de4b6078e94625dcade81e98faf63ffd294964ee5" Oct 01 13:14:22 crc kubenswrapper[4851]: E1001 13:14:22.470636 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(971ca0ac-6de7-42f1-bf29-5174fd80ced4)\"" pod="openstack/watcher-decision-engine-0" podUID="971ca0ac-6de7-42f1-bf29-5174fd80ced4" Oct 01 13:14:22 crc kubenswrapper[4851]: I1001 
13:14:22.483620 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0158e7a7-afdc-402e-8822-7ab194243f8e","Type":"ContainerStarted","Data":"df40e46d879d8b1a4ee46fb092effc1a6bf34f752a954b4eadbdd8e91ba06d8a"} Oct 01 13:14:22 crc kubenswrapper[4851]: I1001 13:14:22.483828 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 13:14:22 crc kubenswrapper[4851]: I1001 13:14:22.523803 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.124734686 podStartE2EDuration="4.523770377s" podCreationTimestamp="2025-10-01 13:14:18 +0000 UTC" firstStartedPulling="2025-10-01 13:14:19.296316579 +0000 UTC m=+1267.641434085" lastFinishedPulling="2025-10-01 13:14:21.6953523 +0000 UTC m=+1270.040469776" observedRunningTime="2025-10-01 13:14:22.519274429 +0000 UTC m=+1270.864391925" watchObservedRunningTime="2025-10-01 13:14:22.523770377 +0000 UTC m=+1270.868887863" Oct 01 13:14:22 crc kubenswrapper[4851]: I1001 13:14:22.729756 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 13:14:22 crc kubenswrapper[4851]: I1001 13:14:22.730103 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 13:14:22 crc kubenswrapper[4851]: I1001 13:14:22.761220 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 13:14:22 crc kubenswrapper[4851]: I1001 13:14:22.779079 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 13:14:22 crc kubenswrapper[4851]: I1001 13:14:22.862985 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 13:14:22 crc kubenswrapper[4851]: I1001 13:14:22.863031 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 13:14:22 crc kubenswrapper[4851]: I1001 13:14:22.892685 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 13:14:22 crc kubenswrapper[4851]: I1001 13:14:22.909005 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 13:14:23 crc kubenswrapper[4851]: I1001 13:14:23.441257 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6856b75994-rphb7" Oct 01 13:14:23 crc kubenswrapper[4851]: I1001 13:14:23.502896 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 13:14:23 crc kubenswrapper[4851]: I1001 13:14:23.502936 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 13:14:23 crc kubenswrapper[4851]: I1001 13:14:23.502996 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 13:14:23 crc kubenswrapper[4851]: I1001 13:14:23.503009 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 13:14:25 crc kubenswrapper[4851]: I1001 13:14:25.247561 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openstack/watcher-decision-engine-0" Oct 01 13:14:25 crc kubenswrapper[4851]: I1001 13:14:25.247998 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 01 13:14:25 crc kubenswrapper[4851]: I1001 13:14:25.248024 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 01 13:14:25 crc kubenswrapper[4851]: I1001 13:14:25.248032 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 01 13:14:25 crc kubenswrapper[4851]: I1001 13:14:25.248673 4851 scope.go:117] "RemoveContainer" containerID="a68fbdceade2850c22adba2de4b6078e94625dcade81e98faf63ffd294964ee5" Oct 01 13:14:25 crc kubenswrapper[4851]: E1001 13:14:25.248873 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(971ca0ac-6de7-42f1-bf29-5174fd80ced4)\"" pod="openstack/watcher-decision-engine-0" podUID="971ca0ac-6de7-42f1-bf29-5174fd80ced4" Oct 01 13:14:25 crc kubenswrapper[4851]: I1001 13:14:25.481948 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-57698c5d89-m6vxz" Oct 01 13:14:25 crc kubenswrapper[4851]: I1001 13:14:25.556775 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6856b75994-rphb7"] Oct 01 13:14:25 crc kubenswrapper[4851]: I1001 13:14:25.556983 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6856b75994-rphb7" podUID="ff9726c5-9ce7-444b-9dfd-9b014835c375" containerName="neutron-api" containerID="cri-o://c1c97861f50ca0b69a42a4188b7b857ad0cc56dbbbaf4006b1fffd088aa3e99d" gracePeriod=30 Oct 01 13:14:25 crc kubenswrapper[4851]: I1001 13:14:25.557409 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6856b75994-rphb7" podUID="ff9726c5-9ce7-444b-9dfd-9b014835c375" containerName="neutron-httpd" containerID="cri-o://d0c62f9449fdd155f8a797dfa16bc07b25c56ac496c075ae5870c1b0433eb21c" gracePeriod=30 Oct 01 13:14:25 crc kubenswrapper[4851]: I1001 13:14:25.802209 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 13:14:25 crc kubenswrapper[4851]: I1001 13:14:25.802541 4851 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 13:14:25 crc kubenswrapper[4851]: I1001 13:14:25.894109 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 13:14:25 crc kubenswrapper[4851]: I1001 13:14:25.894435 4851 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 13:14:26 crc kubenswrapper[4851]: I1001 13:14:26.019164 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 13:14:26 crc kubenswrapper[4851]: I1001 13:14:26.280096 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 13:14:26 crc kubenswrapper[4851]: I1001 13:14:26.538254 4851 generic.go:334] "Generic (PLEG): container finished" podID="ff9726c5-9ce7-444b-9dfd-9b014835c375" containerID="d0c62f9449fdd155f8a797dfa16bc07b25c56ac496c075ae5870c1b0433eb21c" exitCode=0 Oct 01 13:14:26 crc 
kubenswrapper[4851]: I1001 13:14:26.538334 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6856b75994-rphb7" event={"ID":"ff9726c5-9ce7-444b-9dfd-9b014835c375","Type":"ContainerDied","Data":"d0c62f9449fdd155f8a797dfa16bc07b25c56ac496c075ae5870c1b0433eb21c"} Oct 01 13:14:28 crc kubenswrapper[4851]: I1001 13:14:28.929676 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:14:28 crc kubenswrapper[4851]: I1001 13:14:28.930564 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0158e7a7-afdc-402e-8822-7ab194243f8e" containerName="ceilometer-central-agent" containerID="cri-o://44b1b979c9a39b7de714608c2e88bf7abc331eaef1e5f2195cd802a8885c3a23" gracePeriod=30 Oct 01 13:14:28 crc kubenswrapper[4851]: I1001 13:14:28.931110 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0158e7a7-afdc-402e-8822-7ab194243f8e" containerName="proxy-httpd" containerID="cri-o://df40e46d879d8b1a4ee46fb092effc1a6bf34f752a954b4eadbdd8e91ba06d8a" gracePeriod=30 Oct 01 13:14:28 crc kubenswrapper[4851]: I1001 13:14:28.931170 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0158e7a7-afdc-402e-8822-7ab194243f8e" containerName="sg-core" containerID="cri-o://351a8782ce730dfc70ca3d3bbff35ee0417490abe9e8c745c832f5a8917fd981" gracePeriod=30 Oct 01 13:14:28 crc kubenswrapper[4851]: I1001 13:14:28.931213 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0158e7a7-afdc-402e-8822-7ab194243f8e" containerName="ceilometer-notification-agent" containerID="cri-o://07b6c171909ccf521028f5fe6ed6f90a0cc8f0ac59fa5d8abe35e2367a469f21" gracePeriod=30 Oct 01 13:14:29 crc kubenswrapper[4851]: I1001 13:14:29.567850 4851 generic.go:334] "Generic (PLEG): container finished" podID="0158e7a7-afdc-402e-8822-7ab194243f8e" containerID="df40e46d879d8b1a4ee46fb092effc1a6bf34f752a954b4eadbdd8e91ba06d8a" exitCode=0 Oct 01 13:14:29 crc kubenswrapper[4851]: I1001 13:14:29.568100 4851 generic.go:334] "Generic (PLEG): container finished" podID="0158e7a7-afdc-402e-8822-7ab194243f8e" containerID="351a8782ce730dfc70ca3d3bbff35ee0417490abe9e8c745c832f5a8917fd981" exitCode=2 Oct 01 13:14:29 crc kubenswrapper[4851]: I1001 13:14:29.568165 4851 generic.go:334] "Generic (PLEG): container finished" podID="0158e7a7-afdc-402e-8822-7ab194243f8e" containerID="44b1b979c9a39b7de714608c2e88bf7abc331eaef1e5f2195cd802a8885c3a23" exitCode=0 Oct 01 13:14:29 crc kubenswrapper[4851]: I1001 13:14:29.567897 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0158e7a7-afdc-402e-8822-7ab194243f8e","Type":"ContainerDied","Data":"df40e46d879d8b1a4ee46fb092effc1a6bf34f752a954b4eadbdd8e91ba06d8a"} Oct 01 13:14:29 crc kubenswrapper[4851]: I1001 13:14:29.568294 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0158e7a7-afdc-402e-8822-7ab194243f8e","Type":"ContainerDied","Data":"351a8782ce730dfc70ca3d3bbff35ee0417490abe9e8c745c832f5a8917fd981"} Oct 01 13:14:29 crc kubenswrapper[4851]: I1001 13:14:29.568363 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0158e7a7-afdc-402e-8822-7ab194243f8e","Type":"ContainerDied","Data":"44b1b979c9a39b7de714608c2e88bf7abc331eaef1e5f2195cd802a8885c3a23"} Oct 01 13:14:29 crc kubenswrapper[4851]: I1001 13:14:29.834749 
4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-31e1-account-create-rczp4"] Oct 01 13:14:29 crc kubenswrapper[4851]: I1001 13:14:29.836333 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-31e1-account-create-rczp4" Oct 01 13:14:29 crc kubenswrapper[4851]: I1001 13:14:29.839882 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 01 13:14:29 crc kubenswrapper[4851]: I1001 13:14:29.845824 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-31e1-account-create-rczp4"] Oct 01 13:14:29 crc kubenswrapper[4851]: I1001 13:14:29.991958 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gpkm\" (UniqueName: \"kubernetes.io/projected/417dd621-4a2f-45a8-8060-26e9584b1916-kube-api-access-4gpkm\") pod \"nova-api-31e1-account-create-rczp4\" (UID: \"417dd621-4a2f-45a8-8060-26e9584b1916\") " pod="openstack/nova-api-31e1-account-create-rczp4" Oct 01 13:14:30 crc kubenswrapper[4851]: I1001 13:14:30.012439 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-b1ae-account-create-glmhj"] Oct 01 13:14:30 crc kubenswrapper[4851]: I1001 13:14:30.014002 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b1ae-account-create-glmhj" Oct 01 13:14:30 crc kubenswrapper[4851]: I1001 13:14:30.016695 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 01 13:14:30 crc kubenswrapper[4851]: I1001 13:14:30.031281 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b1ae-account-create-glmhj"] Oct 01 13:14:30 crc kubenswrapper[4851]: I1001 13:14:30.093984 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gpkm\" (UniqueName: \"kubernetes.io/projected/417dd621-4a2f-45a8-8060-26e9584b1916-kube-api-access-4gpkm\") pod \"nova-api-31e1-account-create-rczp4\" (UID: \"417dd621-4a2f-45a8-8060-26e9584b1916\") " pod="openstack/nova-api-31e1-account-create-rczp4" Oct 01 13:14:30 crc kubenswrapper[4851]: I1001 13:14:30.094377 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqcpp\" (UniqueName: \"kubernetes.io/projected/68d8656a-3cbc-4868-9e5e-4be315aa3a8d-kube-api-access-qqcpp\") pod \"nova-cell0-b1ae-account-create-glmhj\" (UID: \"68d8656a-3cbc-4868-9e5e-4be315aa3a8d\") " pod="openstack/nova-cell0-b1ae-account-create-glmhj" Oct 01 13:14:30 crc kubenswrapper[4851]: I1001 13:14:30.117169 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gpkm\" (UniqueName: \"kubernetes.io/projected/417dd621-4a2f-45a8-8060-26e9584b1916-kube-api-access-4gpkm\") pod \"nova-api-31e1-account-create-rczp4\" (UID: \"417dd621-4a2f-45a8-8060-26e9584b1916\") " pod="openstack/nova-api-31e1-account-create-rczp4" Oct 01 13:14:30 crc kubenswrapper[4851]: I1001 13:14:30.196166 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqcpp\" (UniqueName: \"kubernetes.io/projected/68d8656a-3cbc-4868-9e5e-4be315aa3a8d-kube-api-access-qqcpp\") pod \"nova-cell0-b1ae-account-create-glmhj\" (UID: \"68d8656a-3cbc-4868-9e5e-4be315aa3a8d\") " pod="openstack/nova-cell0-b1ae-account-create-glmhj" Oct 01 13:14:30 crc kubenswrapper[4851]: I1001 13:14:30.219968 4851 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qqcpp\" (UniqueName: \"kubernetes.io/projected/68d8656a-3cbc-4868-9e5e-4be315aa3a8d-kube-api-access-qqcpp\") pod \"nova-cell0-b1ae-account-create-glmhj\" (UID: \"68d8656a-3cbc-4868-9e5e-4be315aa3a8d\") " pod="openstack/nova-cell0-b1ae-account-create-glmhj" Oct 01 13:14:30 crc kubenswrapper[4851]: I1001 13:14:30.222172 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-4ab6-account-create-tfwql"] Oct 01 13:14:30 crc kubenswrapper[4851]: I1001 13:14:30.224157 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4ab6-account-create-tfwql" Oct 01 13:14:30 crc kubenswrapper[4851]: I1001 13:14:30.225270 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-31e1-account-create-rczp4" Oct 01 13:14:30 crc kubenswrapper[4851]: I1001 13:14:30.230925 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 01 13:14:30 crc kubenswrapper[4851]: I1001 13:14:30.239825 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4ab6-account-create-tfwql"] Oct 01 13:14:30 crc kubenswrapper[4851]: I1001 13:14:30.298097 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5mwp\" (UniqueName: \"kubernetes.io/projected/44f78fae-38e4-48f5-9318-34c712a404d2-kube-api-access-m5mwp\") pod \"nova-cell1-4ab6-account-create-tfwql\" (UID: \"44f78fae-38e4-48f5-9318-34c712a404d2\") " pod="openstack/nova-cell1-4ab6-account-create-tfwql" Oct 01 13:14:30 crc kubenswrapper[4851]: I1001 13:14:30.338928 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b1ae-account-create-glmhj" Oct 01 13:14:30 crc kubenswrapper[4851]: I1001 13:14:30.400851 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5mwp\" (UniqueName: \"kubernetes.io/projected/44f78fae-38e4-48f5-9318-34c712a404d2-kube-api-access-m5mwp\") pod \"nova-cell1-4ab6-account-create-tfwql\" (UID: \"44f78fae-38e4-48f5-9318-34c712a404d2\") " pod="openstack/nova-cell1-4ab6-account-create-tfwql" Oct 01 13:14:30 crc kubenswrapper[4851]: I1001 13:14:30.467209 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5mwp\" (UniqueName: \"kubernetes.io/projected/44f78fae-38e4-48f5-9318-34c712a404d2-kube-api-access-m5mwp\") pod \"nova-cell1-4ab6-account-create-tfwql\" (UID: \"44f78fae-38e4-48f5-9318-34c712a404d2\") " pod="openstack/nova-cell1-4ab6-account-create-tfwql" Oct 01 13:14:30 crc kubenswrapper[4851]: I1001 13:14:30.646976 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4ab6-account-create-tfwql" Oct 01 13:14:30 crc kubenswrapper[4851]: I1001 13:14:30.730742 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-31e1-account-create-rczp4"] Oct 01 13:14:30 crc kubenswrapper[4851]: W1001 13:14:30.744554 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod417dd621_4a2f_45a8_8060_26e9584b1916.slice/crio-86941642b88220508b1770b630553e0a7014ab5b78d916d5e8c2b6b9a10e39ff WatchSource:0}: Error finding container 86941642b88220508b1770b630553e0a7014ab5b78d916d5e8c2b6b9a10e39ff: Status 404 returned error can't find the container with id 86941642b88220508b1770b630553e0a7014ab5b78d916d5e8c2b6b9a10e39ff Oct 01 13:14:30 crc kubenswrapper[4851]: I1001 13:14:30.939072 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b1ae-account-create-glmhj"] Oct 01 13:14:31 crc kubenswrapper[4851]: I1001 13:14:31.151068 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4ab6-account-create-tfwql"] Oct 01 13:14:31 crc kubenswrapper[4851]: I1001 13:14:31.604320 4851 generic.go:334] "Generic (PLEG): container finished" podID="44f78fae-38e4-48f5-9318-34c712a404d2" containerID="77c9cfdf83b5158040fe6acf29bdbff56a92e63f3eed5e4825987259fd59c0a7" exitCode=0 Oct 01 13:14:31 crc kubenswrapper[4851]: I1001 13:14:31.604790 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4ab6-account-create-tfwql" event={"ID":"44f78fae-38e4-48f5-9318-34c712a404d2","Type":"ContainerDied","Data":"77c9cfdf83b5158040fe6acf29bdbff56a92e63f3eed5e4825987259fd59c0a7"} Oct 01 13:14:31 crc kubenswrapper[4851]: I1001 13:14:31.604821 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4ab6-account-create-tfwql" event={"ID":"44f78fae-38e4-48f5-9318-34c712a404d2","Type":"ContainerStarted","Data":"e3633dc8583badc41fb6c0f8c8207edc9413bcc509c15802731be9fab5dc1652"} Oct 01 13:14:31 crc kubenswrapper[4851]: I1001 13:14:31.606760 4851 generic.go:334] "Generic (PLEG): container finished" podID="68d8656a-3cbc-4868-9e5e-4be315aa3a8d" containerID="5cccac4181ff635360aa92dde6191f39997a1d7e3be9fff27bb1844e58cddce5" exitCode=0 Oct 01 13:14:31 crc kubenswrapper[4851]: I1001 13:14:31.606799 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b1ae-account-create-glmhj" event={"ID":"68d8656a-3cbc-4868-9e5e-4be315aa3a8d","Type":"ContainerDied","Data":"5cccac4181ff635360aa92dde6191f39997a1d7e3be9fff27bb1844e58cddce5"} Oct 01 13:14:31 crc kubenswrapper[4851]: I1001 13:14:31.606816 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b1ae-account-create-glmhj" event={"ID":"68d8656a-3cbc-4868-9e5e-4be315aa3a8d","Type":"ContainerStarted","Data":"040a4c506bb54539c3de39f5a453073c8a9c5980f88278e07dd38813fb3ce95b"} Oct 01 13:14:31 crc kubenswrapper[4851]: I1001 13:14:31.608789 4851 generic.go:334] "Generic (PLEG): container finished" podID="417dd621-4a2f-45a8-8060-26e9584b1916" containerID="69f651f01f7708fea9cf46f422c6823c2ccd1b53dd4dbf05a8778f2235def0c3" exitCode=0 Oct 01 13:14:31 crc kubenswrapper[4851]: I1001 13:14:31.608832 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-31e1-account-create-rczp4" event={"ID":"417dd621-4a2f-45a8-8060-26e9584b1916","Type":"ContainerDied","Data":"69f651f01f7708fea9cf46f422c6823c2ccd1b53dd4dbf05a8778f2235def0c3"} Oct 01 13:14:31 crc kubenswrapper[4851]: I1001 
13:14:31.608847 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-31e1-account-create-rczp4" event={"ID":"417dd621-4a2f-45a8-8060-26e9584b1916","Type":"ContainerStarted","Data":"86941642b88220508b1770b630553e0a7014ab5b78d916d5e8c2b6b9a10e39ff"} Oct 01 13:14:31 crc kubenswrapper[4851]: I1001 13:14:31.611199 4851 generic.go:334] "Generic (PLEG): container finished" podID="ff9726c5-9ce7-444b-9dfd-9b014835c375" containerID="c1c97861f50ca0b69a42a4188b7b857ad0cc56dbbbaf4006b1fffd088aa3e99d" exitCode=0 Oct 01 13:14:31 crc kubenswrapper[4851]: I1001 13:14:31.611239 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6856b75994-rphb7" event={"ID":"ff9726c5-9ce7-444b-9dfd-9b014835c375","Type":"ContainerDied","Data":"c1c97861f50ca0b69a42a4188b7b857ad0cc56dbbbaf4006b1fffd088aa3e99d"} Oct 01 13:14:31 crc kubenswrapper[4851]: I1001 13:14:31.615539 4851 generic.go:334] "Generic (PLEG): container finished" podID="0158e7a7-afdc-402e-8822-7ab194243f8e" containerID="07b6c171909ccf521028f5fe6ed6f90a0cc8f0ac59fa5d8abe35e2367a469f21" exitCode=0 Oct 01 13:14:31 crc kubenswrapper[4851]: I1001 13:14:31.615578 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0158e7a7-afdc-402e-8822-7ab194243f8e","Type":"ContainerDied","Data":"07b6c171909ccf521028f5fe6ed6f90a0cc8f0ac59fa5d8abe35e2367a469f21"} Oct 01 13:14:31 crc kubenswrapper[4851]: I1001 13:14:31.876081 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6856b75994-rphb7" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.038903 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5skp\" (UniqueName: \"kubernetes.io/projected/ff9726c5-9ce7-444b-9dfd-9b014835c375-kube-api-access-t5skp\") pod \"ff9726c5-9ce7-444b-9dfd-9b014835c375\" (UID: \"ff9726c5-9ce7-444b-9dfd-9b014835c375\") " Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.038995 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-httpd-config\") pod \"ff9726c5-9ce7-444b-9dfd-9b014835c375\" (UID: \"ff9726c5-9ce7-444b-9dfd-9b014835c375\") " Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.039028 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-ovndb-tls-certs\") pod \"ff9726c5-9ce7-444b-9dfd-9b014835c375\" (UID: \"ff9726c5-9ce7-444b-9dfd-9b014835c375\") " Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.039070 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-config\") pod \"ff9726c5-9ce7-444b-9dfd-9b014835c375\" (UID: \"ff9726c5-9ce7-444b-9dfd-9b014835c375\") " Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.039117 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-combined-ca-bundle\") pod \"ff9726c5-9ce7-444b-9dfd-9b014835c375\" (UID: \"ff9726c5-9ce7-444b-9dfd-9b014835c375\") " Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.043870 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ff9726c5-9ce7-444b-9dfd-9b014835c375-kube-api-access-t5skp" (OuterVolumeSpecName: "kube-api-access-t5skp") pod "ff9726c5-9ce7-444b-9dfd-9b014835c375" (UID: "ff9726c5-9ce7-444b-9dfd-9b014835c375"). InnerVolumeSpecName "kube-api-access-t5skp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.045457 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ff9726c5-9ce7-444b-9dfd-9b014835c375" (UID: "ff9726c5-9ce7-444b-9dfd-9b014835c375"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.075900 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.110088 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-config" (OuterVolumeSpecName: "config") pod "ff9726c5-9ce7-444b-9dfd-9b014835c375" (UID: "ff9726c5-9ce7-444b-9dfd-9b014835c375"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.141721 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5skp\" (UniqueName: \"kubernetes.io/projected/ff9726c5-9ce7-444b-9dfd-9b014835c375-kube-api-access-t5skp\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.141781 4851 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.141798 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.155477 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "ff9726c5-9ce7-444b-9dfd-9b014835c375" (UID: "ff9726c5-9ce7-444b-9dfd-9b014835c375"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.159695 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff9726c5-9ce7-444b-9dfd-9b014835c375" (UID: "ff9726c5-9ce7-444b-9dfd-9b014835c375"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.243542 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0158e7a7-afdc-402e-8822-7ab194243f8e-run-httpd\") pod \"0158e7a7-afdc-402e-8822-7ab194243f8e\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.243648 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0158e7a7-afdc-402e-8822-7ab194243f8e-log-httpd\") pod \"0158e7a7-afdc-402e-8822-7ab194243f8e\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.243669 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-scripts\") pod \"0158e7a7-afdc-402e-8822-7ab194243f8e\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.243707 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-combined-ca-bundle\") pod \"0158e7a7-afdc-402e-8822-7ab194243f8e\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.243741 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-config-data\") pod \"0158e7a7-afdc-402e-8822-7ab194243f8e\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.243844 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-sg-core-conf-yaml\") pod \"0158e7a7-afdc-402e-8822-7ab194243f8e\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.243896 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlcms\" (UniqueName: \"kubernetes.io/projected/0158e7a7-afdc-402e-8822-7ab194243f8e-kube-api-access-vlcms\") pod \"0158e7a7-afdc-402e-8822-7ab194243f8e\" (UID: \"0158e7a7-afdc-402e-8822-7ab194243f8e\") " Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.243906 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0158e7a7-afdc-402e-8822-7ab194243f8e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0158e7a7-afdc-402e-8822-7ab194243f8e" (UID: "0158e7a7-afdc-402e-8822-7ab194243f8e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.244149 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0158e7a7-afdc-402e-8822-7ab194243f8e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0158e7a7-afdc-402e-8822-7ab194243f8e" (UID: "0158e7a7-afdc-402e-8822-7ab194243f8e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.244528 4851 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0158e7a7-afdc-402e-8822-7ab194243f8e-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.244546 4851 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0158e7a7-afdc-402e-8822-7ab194243f8e-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.244556 4851 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.244566 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff9726c5-9ce7-444b-9dfd-9b014835c375-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.250039 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-scripts" (OuterVolumeSpecName: "scripts") pod "0158e7a7-afdc-402e-8822-7ab194243f8e" (UID: "0158e7a7-afdc-402e-8822-7ab194243f8e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.250309 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0158e7a7-afdc-402e-8822-7ab194243f8e-kube-api-access-vlcms" (OuterVolumeSpecName: "kube-api-access-vlcms") pod "0158e7a7-afdc-402e-8822-7ab194243f8e" (UID: "0158e7a7-afdc-402e-8822-7ab194243f8e"). InnerVolumeSpecName "kube-api-access-vlcms". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.293020 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0158e7a7-afdc-402e-8822-7ab194243f8e" (UID: "0158e7a7-afdc-402e-8822-7ab194243f8e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.340812 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0158e7a7-afdc-402e-8822-7ab194243f8e" (UID: "0158e7a7-afdc-402e-8822-7ab194243f8e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.346683 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.346714 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.346726 4851 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.346735 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlcms\" (UniqueName: \"kubernetes.io/projected/0158e7a7-afdc-402e-8822-7ab194243f8e-kube-api-access-vlcms\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.391746 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-config-data" (OuterVolumeSpecName: "config-data") pod "0158e7a7-afdc-402e-8822-7ab194243f8e" (UID: "0158e7a7-afdc-402e-8822-7ab194243f8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.448361 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0158e7a7-afdc-402e-8822-7ab194243f8e-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.627019 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6856b75994-rphb7" event={"ID":"ff9726c5-9ce7-444b-9dfd-9b014835c375","Type":"ContainerDied","Data":"b24bb130c93518b165533525734202bb0ecabcb180a5821f5ed6d635b57c777a"} Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.627066 4851 scope.go:117] "RemoveContainer" containerID="d0c62f9449fdd155f8a797dfa16bc07b25c56ac496c075ae5870c1b0433eb21c" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.627179 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6856b75994-rphb7" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.631789 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0158e7a7-afdc-402e-8822-7ab194243f8e","Type":"ContainerDied","Data":"ad6615454cb0c5242f75e3a0daf1828d7fb27094fdf87ce141d3acd025194489"} Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.631860 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.660370 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6856b75994-rphb7"] Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.678662 4851 scope.go:117] "RemoveContainer" containerID="c1c97861f50ca0b69a42a4188b7b857ad0cc56dbbbaf4006b1fffd088aa3e99d" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.682617 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6856b75994-rphb7"] Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.690154 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.705595 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.721569 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:14:32 crc kubenswrapper[4851]: E1001 13:14:32.722039 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff9726c5-9ce7-444b-9dfd-9b014835c375" containerName="neutron-api" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.722061 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9726c5-9ce7-444b-9dfd-9b014835c375" containerName="neutron-api" Oct 01 13:14:32 crc kubenswrapper[4851]: E1001 13:14:32.722092 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0158e7a7-afdc-402e-8822-7ab194243f8e" containerName="ceilometer-notification-agent" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.722103 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="0158e7a7-afdc-402e-8822-7ab194243f8e" containerName="ceilometer-notification-agent" Oct 01 13:14:32 crc kubenswrapper[4851]: E1001 13:14:32.722122 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff9726c5-9ce7-444b-9dfd-9b014835c375" containerName="neutron-httpd" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.722130 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9726c5-9ce7-444b-9dfd-9b014835c375" containerName="neutron-httpd" Oct 01 13:14:32 crc kubenswrapper[4851]: E1001 13:14:32.722142 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0158e7a7-afdc-402e-8822-7ab194243f8e" containerName="proxy-httpd" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.722150 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="0158e7a7-afdc-402e-8822-7ab194243f8e" containerName="proxy-httpd" Oct 01 13:14:32 crc kubenswrapper[4851]: E1001 13:14:32.722160 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0158e7a7-afdc-402e-8822-7ab194243f8e" containerName="ceilometer-central-agent" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.722171 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="0158e7a7-afdc-402e-8822-7ab194243f8e" containerName="ceilometer-central-agent" Oct 01 13:14:32 crc kubenswrapper[4851]: E1001 13:14:32.722184 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0158e7a7-afdc-402e-8822-7ab194243f8e" containerName="sg-core" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.722191 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="0158e7a7-afdc-402e-8822-7ab194243f8e" containerName="sg-core" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.722371 4851 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ff9726c5-9ce7-444b-9dfd-9b014835c375" containerName="neutron-httpd" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.722383 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="0158e7a7-afdc-402e-8822-7ab194243f8e" containerName="ceilometer-notification-agent" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.722399 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="0158e7a7-afdc-402e-8822-7ab194243f8e" containerName="ceilometer-central-agent" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.722411 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="0158e7a7-afdc-402e-8822-7ab194243f8e" containerName="sg-core" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.722426 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="0158e7a7-afdc-402e-8822-7ab194243f8e" containerName="proxy-httpd" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.722443 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff9726c5-9ce7-444b-9dfd-9b014835c375" containerName="neutron-api" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.724242 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.724450 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.726321 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.726736 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.736118 4851 scope.go:117] "RemoveContainer" containerID="df40e46d879d8b1a4ee46fb092effc1a6bf34f752a954b4eadbdd8e91ba06d8a" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.786045 4851 scope.go:117] "RemoveContainer" containerID="351a8782ce730dfc70ca3d3bbff35ee0417490abe9e8c745c832f5a8917fd981" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.809900 4851 scope.go:117] "RemoveContainer" containerID="07b6c171909ccf521028f5fe6ed6f90a0cc8f0ac59fa5d8abe35e2367a469f21" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.833675 4851 scope.go:117] "RemoveContainer" containerID="44b1b979c9a39b7de714608c2e88bf7abc331eaef1e5f2195cd802a8885c3a23" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.859718 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") " pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.859758 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6f9bb3a-4675-4149-ae26-1345c54e4d72-run-httpd\") pod \"ceilometer-0\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") " pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.859877 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") 
" pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.859898 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck9hz\" (UniqueName: \"kubernetes.io/projected/f6f9bb3a-4675-4149-ae26-1345c54e4d72-kube-api-access-ck9hz\") pod \"ceilometer-0\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") " pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.859917 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-config-data\") pod \"ceilometer-0\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") " pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.859930 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-scripts\") pod \"ceilometer-0\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") " pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.859954 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6f9bb3a-4675-4149-ae26-1345c54e4d72-log-httpd\") pod \"ceilometer-0\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") " pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.963848 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") " pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.963906 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck9hz\" (UniqueName: \"kubernetes.io/projected/f6f9bb3a-4675-4149-ae26-1345c54e4d72-kube-api-access-ck9hz\") pod \"ceilometer-0\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") " pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.963937 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-config-data\") pod \"ceilometer-0\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") " pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.963957 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-scripts\") pod \"ceilometer-0\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") " pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.963994 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6f9bb3a-4675-4149-ae26-1345c54e4d72-log-httpd\") pod \"ceilometer-0\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") " pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.964050 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") " pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.964094 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6f9bb3a-4675-4149-ae26-1345c54e4d72-run-httpd\") pod \"ceilometer-0\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") " pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.964697 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6f9bb3a-4675-4149-ae26-1345c54e4d72-run-httpd\") pod \"ceilometer-0\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") " pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.965182 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6f9bb3a-4675-4149-ae26-1345c54e4d72-log-httpd\") pod \"ceilometer-0\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") " pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.969687 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") " pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.977097 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-scripts\") pod \"ceilometer-0\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") " pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.982048 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") " pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.983212 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-config-data\") pod \"ceilometer-0\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") " pod="openstack/ceilometer-0" Oct 01 13:14:32 crc kubenswrapper[4851]: I1001 13:14:32.989932 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck9hz\" (UniqueName: \"kubernetes.io/projected/f6f9bb3a-4675-4149-ae26-1345c54e4d72-kube-api-access-ck9hz\") pod \"ceilometer-0\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") " pod="openstack/ceilometer-0" Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.050272 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.126545 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b1ae-account-create-glmhj" Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.260728 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-31e1-account-create-rczp4" Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.268238 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqcpp\" (UniqueName: \"kubernetes.io/projected/68d8656a-3cbc-4868-9e5e-4be315aa3a8d-kube-api-access-qqcpp\") pod \"68d8656a-3cbc-4868-9e5e-4be315aa3a8d\" (UID: \"68d8656a-3cbc-4868-9e5e-4be315aa3a8d\") " Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.268454 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4ab6-account-create-tfwql" Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.305670 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d8656a-3cbc-4868-9e5e-4be315aa3a8d-kube-api-access-qqcpp" (OuterVolumeSpecName: "kube-api-access-qqcpp") pod "68d8656a-3cbc-4868-9e5e-4be315aa3a8d" (UID: "68d8656a-3cbc-4868-9e5e-4be315aa3a8d"). InnerVolumeSpecName "kube-api-access-qqcpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.370747 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gpkm\" (UniqueName: \"kubernetes.io/projected/417dd621-4a2f-45a8-8060-26e9584b1916-kube-api-access-4gpkm\") pod \"417dd621-4a2f-45a8-8060-26e9584b1916\" (UID: \"417dd621-4a2f-45a8-8060-26e9584b1916\") " Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.370824 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5mwp\" (UniqueName: \"kubernetes.io/projected/44f78fae-38e4-48f5-9318-34c712a404d2-kube-api-access-m5mwp\") pod \"44f78fae-38e4-48f5-9318-34c712a404d2\" (UID: \"44f78fae-38e4-48f5-9318-34c712a404d2\") " Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.371318 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqcpp\" (UniqueName: \"kubernetes.io/projected/68d8656a-3cbc-4868-9e5e-4be315aa3a8d-kube-api-access-qqcpp\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.382127 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44f78fae-38e4-48f5-9318-34c712a404d2-kube-api-access-m5mwp" (OuterVolumeSpecName: "kube-api-access-m5mwp") pod "44f78fae-38e4-48f5-9318-34c712a404d2" (UID: "44f78fae-38e4-48f5-9318-34c712a404d2"). InnerVolumeSpecName "kube-api-access-m5mwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.382259 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/417dd621-4a2f-45a8-8060-26e9584b1916-kube-api-access-4gpkm" (OuterVolumeSpecName: "kube-api-access-4gpkm") pod "417dd621-4a2f-45a8-8060-26e9584b1916" (UID: "417dd621-4a2f-45a8-8060-26e9584b1916"). InnerVolumeSpecName "kube-api-access-4gpkm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.473238 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5mwp\" (UniqueName: \"kubernetes.io/projected/44f78fae-38e4-48f5-9318-34c712a404d2-kube-api-access-m5mwp\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.473281 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gpkm\" (UniqueName: \"kubernetes.io/projected/417dd621-4a2f-45a8-8060-26e9584b1916-kube-api-access-4gpkm\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.573834 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.615283 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.648883 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b1ae-account-create-glmhj" Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.650598 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b1ae-account-create-glmhj" event={"ID":"68d8656a-3cbc-4868-9e5e-4be315aa3a8d","Type":"ContainerDied","Data":"040a4c506bb54539c3de39f5a453073c8a9c5980f88278e07dd38813fb3ce95b"} Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.650643 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="040a4c506bb54539c3de39f5a453073c8a9c5980f88278e07dd38813fb3ce95b" Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.652423 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-31e1-account-create-rczp4" event={"ID":"417dd621-4a2f-45a8-8060-26e9584b1916","Type":"ContainerDied","Data":"86941642b88220508b1770b630553e0a7014ab5b78d916d5e8c2b6b9a10e39ff"} Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.652511 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-31e1-account-create-rczp4" Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.652492 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86941642b88220508b1770b630553e0a7014ab5b78d916d5e8c2b6b9a10e39ff" Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.655834 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6f9bb3a-4675-4149-ae26-1345c54e4d72","Type":"ContainerStarted","Data":"df0f62288203b001164e676d7af6242c11813578bcb3e40edc1ed06782f5816d"} Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.660184 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4ab6-account-create-tfwql" event={"ID":"44f78fae-38e4-48f5-9318-34c712a404d2","Type":"ContainerDied","Data":"e3633dc8583badc41fb6c0f8c8207edc9413bcc509c15802731be9fab5dc1652"} Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.660223 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4ab6-account-create-tfwql" Oct 01 13:14:33 crc kubenswrapper[4851]: I1001 13:14:33.660236 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3633dc8583badc41fb6c0f8c8207edc9413bcc509c15802731be9fab5dc1652" Oct 01 13:14:34 crc kubenswrapper[4851]: I1001 13:14:34.341622 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0158e7a7-afdc-402e-8822-7ab194243f8e" path="/var/lib/kubelet/pods/0158e7a7-afdc-402e-8822-7ab194243f8e/volumes" Oct 01 13:14:34 crc kubenswrapper[4851]: I1001 13:14:34.343015 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff9726c5-9ce7-444b-9dfd-9b014835c375" path="/var/lib/kubelet/pods/ff9726c5-9ce7-444b-9dfd-9b014835c375/volumes" Oct 01 13:14:34 crc kubenswrapper[4851]: I1001 13:14:34.677549 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6f9bb3a-4675-4149-ae26-1345c54e4d72","Type":"ContainerStarted","Data":"8f653a1b09ffa74b7a05a17a60ca76bc9d30dbbc8ea5cb4c9445fa3e04466936"} Oct 01 13:14:34 crc kubenswrapper[4851]: I1001 13:14:34.677770 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6f9bb3a-4675-4149-ae26-1345c54e4d72","Type":"ContainerStarted","Data":"74e797d6b0482d82c3eec450e49e4d3d1710a946b054e9868bb4c3778f319770"} Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.332345 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dx6g6"] Oct 01 13:14:35 crc kubenswrapper[4851]: E1001 13:14:35.333206 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44f78fae-38e4-48f5-9318-34c712a404d2" containerName="mariadb-account-create" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.333230 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f78fae-38e4-48f5-9318-34c712a404d2" containerName="mariadb-account-create" Oct 01 13:14:35 crc kubenswrapper[4851]: E1001 13:14:35.333245 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="417dd621-4a2f-45a8-8060-26e9584b1916" containerName="mariadb-account-create" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.333273 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="417dd621-4a2f-45a8-8060-26e9584b1916" containerName="mariadb-account-create" Oct 01 13:14:35 crc kubenswrapper[4851]: E1001 13:14:35.333290 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d8656a-3cbc-4868-9e5e-4be315aa3a8d" containerName="mariadb-account-create" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.333298 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d8656a-3cbc-4868-9e5e-4be315aa3a8d" containerName="mariadb-account-create" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.333548 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="417dd621-4a2f-45a8-8060-26e9584b1916" containerName="mariadb-account-create" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.333582 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="44f78fae-38e4-48f5-9318-34c712a404d2" containerName="mariadb-account-create" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.333596 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="68d8656a-3cbc-4868-9e5e-4be315aa3a8d" containerName="mariadb-account-create" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.334306 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dx6g6" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.344715 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dx6g6"] Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.347624 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.347887 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-8crgs" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.348325 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.408791 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjhkv\" (UniqueName: \"kubernetes.io/projected/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-kube-api-access-tjhkv\") pod \"nova-cell0-conductor-db-sync-dx6g6\" (UID: \"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e\") " pod="openstack/nova-cell0-conductor-db-sync-dx6g6" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.408859 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-scripts\") pod \"nova-cell0-conductor-db-sync-dx6g6\" (UID: \"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e\") " pod="openstack/nova-cell0-conductor-db-sync-dx6g6" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.408931 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-dx6g6\" (UID: \"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e\") " pod="openstack/nova-cell0-conductor-db-sync-dx6g6" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.409003 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-config-data\") pod \"nova-cell0-conductor-db-sync-dx6g6\" (UID: \"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e\") " pod="openstack/nova-cell0-conductor-db-sync-dx6g6" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.510864 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjhkv\" (UniqueName: \"kubernetes.io/projected/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-kube-api-access-tjhkv\") pod \"nova-cell0-conductor-db-sync-dx6g6\" (UID: \"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e\") " pod="openstack/nova-cell0-conductor-db-sync-dx6g6" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.510926 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-scripts\") pod \"nova-cell0-conductor-db-sync-dx6g6\" (UID: \"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e\") " pod="openstack/nova-cell0-conductor-db-sync-dx6g6" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.511006 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-dx6g6\" (UID: 
\"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e\") " pod="openstack/nova-cell0-conductor-db-sync-dx6g6" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.511082 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-config-data\") pod \"nova-cell0-conductor-db-sync-dx6g6\" (UID: \"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e\") " pod="openstack/nova-cell0-conductor-db-sync-dx6g6" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.518323 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-scripts\") pod \"nova-cell0-conductor-db-sync-dx6g6\" (UID: \"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e\") " pod="openstack/nova-cell0-conductor-db-sync-dx6g6" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.518390 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-dx6g6\" (UID: \"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e\") " pod="openstack/nova-cell0-conductor-db-sync-dx6g6" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.519319 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-config-data\") pod \"nova-cell0-conductor-db-sync-dx6g6\" (UID: \"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e\") " pod="openstack/nova-cell0-conductor-db-sync-dx6g6" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.529406 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjhkv\" (UniqueName: \"kubernetes.io/projected/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-kube-api-access-tjhkv\") pod \"nova-cell0-conductor-db-sync-dx6g6\" (UID: \"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e\") " pod="openstack/nova-cell0-conductor-db-sync-dx6g6" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.655112 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dx6g6" Oct 01 13:14:35 crc kubenswrapper[4851]: I1001 13:14:35.695933 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6f9bb3a-4675-4149-ae26-1345c54e4d72","Type":"ContainerStarted","Data":"51f449af07f3c853203ad626d7bd99b3b641d3684c0906d38ba58d1c931e2890"} Oct 01 13:14:36 crc kubenswrapper[4851]: I1001 13:14:36.169003 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dx6g6"] Oct 01 13:14:36 crc kubenswrapper[4851]: W1001 13:14:36.185805 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a3816a8_ed2c_4f4c_a4ba_485b9182a68e.slice/crio-acc557d8a3ab34beff6079af3e4a320d42d3d19541eeeba34577d9c2d6202fa0 WatchSource:0}: Error finding container acc557d8a3ab34beff6079af3e4a320d42d3d19541eeeba34577d9c2d6202fa0: Status 404 returned error can't find the container with id acc557d8a3ab34beff6079af3e4a320d42d3d19541eeeba34577d9c2d6202fa0 Oct 01 13:14:36 crc kubenswrapper[4851]: I1001 13:14:36.716441 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dx6g6" event={"ID":"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e","Type":"ContainerStarted","Data":"acc557d8a3ab34beff6079af3e4a320d42d3d19541eeeba34577d9c2d6202fa0"} Oct 01 13:14:38 crc kubenswrapper[4851]: I1001 13:14:38.329273 4851 scope.go:117] "RemoveContainer" containerID="a68fbdceade2850c22adba2de4b6078e94625dcade81e98faf63ffd294964ee5" Oct 01 13:14:38 crc kubenswrapper[4851]: E1001 13:14:38.330025 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(971ca0ac-6de7-42f1-bf29-5174fd80ced4)\"" pod="openstack/watcher-decision-engine-0" podUID="971ca0ac-6de7-42f1-bf29-5174fd80ced4" Oct 01 13:14:38 crc kubenswrapper[4851]: I1001 13:14:38.739102 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6f9bb3a-4675-4149-ae26-1345c54e4d72","Type":"ContainerStarted","Data":"63af865fbee25941e06d1450859a9846d602e1399c6d4901786ffadbd26b7a01"} Oct 01 13:14:38 crc kubenswrapper[4851]: I1001 13:14:38.739246 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f6f9bb3a-4675-4149-ae26-1345c54e4d72" containerName="ceilometer-central-agent" containerID="cri-o://74e797d6b0482d82c3eec450e49e4d3d1710a946b054e9868bb4c3778f319770" gracePeriod=30 Oct 01 13:14:38 crc kubenswrapper[4851]: I1001 13:14:38.739461 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 13:14:38 crc kubenswrapper[4851]: I1001 13:14:38.739833 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f6f9bb3a-4675-4149-ae26-1345c54e4d72" containerName="proxy-httpd" containerID="cri-o://63af865fbee25941e06d1450859a9846d602e1399c6d4901786ffadbd26b7a01" gracePeriod=30 Oct 01 13:14:38 crc kubenswrapper[4851]: I1001 13:14:38.739872 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f6f9bb3a-4675-4149-ae26-1345c54e4d72" containerName="ceilometer-notification-agent" containerID="cri-o://8f653a1b09ffa74b7a05a17a60ca76bc9d30dbbc8ea5cb4c9445fa3e04466936" gracePeriod=30 Oct 
01 13:14:38 crc kubenswrapper[4851]: I1001 13:14:38.739990 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f6f9bb3a-4675-4149-ae26-1345c54e4d72" containerName="sg-core" containerID="cri-o://51f449af07f3c853203ad626d7bd99b3b641d3684c0906d38ba58d1c931e2890" gracePeriod=30 Oct 01 13:14:38 crc kubenswrapper[4851]: I1001 13:14:38.768807 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.79037581 podStartE2EDuration="6.768782345s" podCreationTimestamp="2025-10-01 13:14:32 +0000 UTC" firstStartedPulling="2025-10-01 13:14:33.57755232 +0000 UTC m=+1281.922669806" lastFinishedPulling="2025-10-01 13:14:37.555958865 +0000 UTC m=+1285.901076341" observedRunningTime="2025-10-01 13:14:38.76159042 +0000 UTC m=+1287.106707906" watchObservedRunningTime="2025-10-01 13:14:38.768782345 +0000 UTC m=+1287.113899832" Oct 01 13:14:39 crc kubenswrapper[4851]: I1001 13:14:39.750284 4851 generic.go:334] "Generic (PLEG): container finished" podID="f6f9bb3a-4675-4149-ae26-1345c54e4d72" containerID="63af865fbee25941e06d1450859a9846d602e1399c6d4901786ffadbd26b7a01" exitCode=0 Oct 01 13:14:39 crc kubenswrapper[4851]: I1001 13:14:39.750627 4851 generic.go:334] "Generic (PLEG): container finished" podID="f6f9bb3a-4675-4149-ae26-1345c54e4d72" containerID="51f449af07f3c853203ad626d7bd99b3b641d3684c0906d38ba58d1c931e2890" exitCode=2 Oct 01 13:14:39 crc kubenswrapper[4851]: I1001 13:14:39.750639 4851 generic.go:334] "Generic (PLEG): container finished" podID="f6f9bb3a-4675-4149-ae26-1345c54e4d72" containerID="8f653a1b09ffa74b7a05a17a60ca76bc9d30dbbc8ea5cb4c9445fa3e04466936" exitCode=0 Oct 01 13:14:39 crc kubenswrapper[4851]: I1001 13:14:39.750354 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6f9bb3a-4675-4149-ae26-1345c54e4d72","Type":"ContainerDied","Data":"63af865fbee25941e06d1450859a9846d602e1399c6d4901786ffadbd26b7a01"} Oct 01 13:14:39 crc kubenswrapper[4851]: I1001 13:14:39.750680 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6f9bb3a-4675-4149-ae26-1345c54e4d72","Type":"ContainerDied","Data":"51f449af07f3c853203ad626d7bd99b3b641d3684c0906d38ba58d1c931e2890"} Oct 01 13:14:39 crc kubenswrapper[4851]: I1001 13:14:39.750699 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6f9bb3a-4675-4149-ae26-1345c54e4d72","Type":"ContainerDied","Data":"8f653a1b09ffa74b7a05a17a60ca76bc9d30dbbc8ea5cb4c9445fa3e04466936"} Oct 01 13:14:41 crc kubenswrapper[4851]: I1001 13:14:41.770797 4851 generic.go:334] "Generic (PLEG): container finished" podID="f6f9bb3a-4675-4149-ae26-1345c54e4d72" containerID="74e797d6b0482d82c3eec450e49e4d3d1710a946b054e9868bb4c3778f319770" exitCode=0 Oct 01 13:14:41 crc kubenswrapper[4851]: I1001 13:14:41.771409 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6f9bb3a-4675-4149-ae26-1345c54e4d72","Type":"ContainerDied","Data":"74e797d6b0482d82c3eec450e49e4d3d1710a946b054e9868bb4c3778f319770"} Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.497118 4851 util.go:48] "No ready sandbox for pod can be found. 
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.624336 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-scripts\") pod \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") "
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.624381 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-config-data\") pod \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") "
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.624477 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6f9bb3a-4675-4149-ae26-1345c54e4d72-run-httpd\") pod \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") "
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.624523 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-combined-ca-bundle\") pod \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") "
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.624556 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck9hz\" (UniqueName: \"kubernetes.io/projected/f6f9bb3a-4675-4149-ae26-1345c54e4d72-kube-api-access-ck9hz\") pod \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") "
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.625219 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6f9bb3a-4675-4149-ae26-1345c54e4d72-log-httpd\") pod \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") "
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.625267 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-sg-core-conf-yaml\") pod \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\" (UID: \"f6f9bb3a-4675-4149-ae26-1345c54e4d72\") "
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.625561 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6f9bb3a-4675-4149-ae26-1345c54e4d72-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f6f9bb3a-4675-4149-ae26-1345c54e4d72" (UID: "f6f9bb3a-4675-4149-ae26-1345c54e4d72"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.625696 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6f9bb3a-4675-4149-ae26-1345c54e4d72-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f6f9bb3a-4675-4149-ae26-1345c54e4d72" (UID: "f6f9bb3a-4675-4149-ae26-1345c54e4d72"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.630021 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6f9bb3a-4675-4149-ae26-1345c54e4d72-kube-api-access-ck9hz" (OuterVolumeSpecName: "kube-api-access-ck9hz") pod "f6f9bb3a-4675-4149-ae26-1345c54e4d72" (UID: "f6f9bb3a-4675-4149-ae26-1345c54e4d72"). InnerVolumeSpecName "kube-api-access-ck9hz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.630667 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-scripts" (OuterVolumeSpecName: "scripts") pod "f6f9bb3a-4675-4149-ae26-1345c54e4d72" (UID: "f6f9bb3a-4675-4149-ae26-1345c54e4d72"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.668081 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f6f9bb3a-4675-4149-ae26-1345c54e4d72" (UID: "f6f9bb3a-4675-4149-ae26-1345c54e4d72"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.708807 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6f9bb3a-4675-4149-ae26-1345c54e4d72" (UID: "f6f9bb3a-4675-4149-ae26-1345c54e4d72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.727894 4851 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6f9bb3a-4675-4149-ae26-1345c54e4d72-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.727929 4851 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.727940 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.727949 4851 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6f9bb3a-4675-4149-ae26-1345c54e4d72-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.727956 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.727965 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck9hz\" (UniqueName: \"kubernetes.io/projected/f6f9bb3a-4675-4149-ae26-1345c54e4d72-kube-api-access-ck9hz\") on node \"crc\" DevicePath \"\""
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.751756 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-config-data" (OuterVolumeSpecName: "config-data") pod "f6f9bb3a-4675-4149-ae26-1345c54e4d72" (UID: "f6f9bb3a-4675-4149-ae26-1345c54e4d72"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
volume "kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-config-data" (OuterVolumeSpecName: "config-data") pod "f6f9bb3a-4675-4149-ae26-1345c54e4d72" (UID: "f6f9bb3a-4675-4149-ae26-1345c54e4d72"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.830341 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6f9bb3a-4675-4149-ae26-1345c54e4d72-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.836366 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6f9bb3a-4675-4149-ae26-1345c54e4d72","Type":"ContainerDied","Data":"df0f62288203b001164e676d7af6242c11813578bcb3e40edc1ed06782f5816d"} Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.836439 4851 scope.go:117] "RemoveContainer" containerID="63af865fbee25941e06d1450859a9846d602e1399c6d4901786ffadbd26b7a01" Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.836606 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.839005 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dx6g6" event={"ID":"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e","Type":"ContainerStarted","Data":"dd1647f0669f311c2c2126b67d0997ad5e6912fd8eaf2618076fcd267c28eabb"} Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.874935 4851 scope.go:117] "RemoveContainer" containerID="51f449af07f3c853203ad626d7bd99b3b641d3684c0906d38ba58d1c931e2890" Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.880516 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-dx6g6" podStartSLOduration=1.851499606 podStartE2EDuration="10.88045905s" podCreationTimestamp="2025-10-01 13:14:35 +0000 UTC" firstStartedPulling="2025-10-01 13:14:36.189924001 +0000 UTC m=+1284.535041487" lastFinishedPulling="2025-10-01 13:14:45.218883425 +0000 UTC m=+1293.564000931" observedRunningTime="2025-10-01 13:14:45.861223871 +0000 UTC m=+1294.206341357" watchObservedRunningTime="2025-10-01 13:14:45.88045905 +0000 UTC m=+1294.225576536" Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.901933 4851 scope.go:117] "RemoveContainer" containerID="8f653a1b09ffa74b7a05a17a60ca76bc9d30dbbc8ea5cb4c9445fa3e04466936" Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.905645 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.925963 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.928539 4851 scope.go:117] "RemoveContainer" containerID="74e797d6b0482d82c3eec450e49e4d3d1710a946b054e9868bb4c3778f319770" Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.934907 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:14:45 crc kubenswrapper[4851]: E1001 13:14:45.935326 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f9bb3a-4675-4149-ae26-1345c54e4d72" containerName="proxy-httpd" Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.935339 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f9bb3a-4675-4149-ae26-1345c54e4d72" containerName="proxy-httpd" Oct 01 13:14:45 crc 
kubenswrapper[4851]: E1001 13:14:45.935354 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f9bb3a-4675-4149-ae26-1345c54e4d72" containerName="ceilometer-central-agent" Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.935362 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f9bb3a-4675-4149-ae26-1345c54e4d72" containerName="ceilometer-central-agent" Oct 01 13:14:45 crc kubenswrapper[4851]: E1001 13:14:45.935386 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f9bb3a-4675-4149-ae26-1345c54e4d72" containerName="ceilometer-notification-agent" Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.935393 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f9bb3a-4675-4149-ae26-1345c54e4d72" containerName="ceilometer-notification-agent" Oct 01 13:14:45 crc kubenswrapper[4851]: E1001 13:14:45.935410 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f9bb3a-4675-4149-ae26-1345c54e4d72" containerName="sg-core" Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.935417 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f9bb3a-4675-4149-ae26-1345c54e4d72" containerName="sg-core" Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.935628 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6f9bb3a-4675-4149-ae26-1345c54e4d72" containerName="ceilometer-central-agent" Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.935650 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6f9bb3a-4675-4149-ae26-1345c54e4d72" containerName="proxy-httpd" Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.935669 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6f9bb3a-4675-4149-ae26-1345c54e4d72" containerName="ceilometer-notification-agent" Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.935682 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6f9bb3a-4675-4149-ae26-1345c54e4d72" containerName="sg-core" Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.937738 4851 util.go:30] "No sandbox for pod can be found. 
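[editor's note] The cpu_manager/state_mem/memory_manager burst above is routine garbage collection: resource-manager state is keyed by (podUID, containerName), and once pod f6f9bb3a-... no longer exists its stale entries are dropped. Note the "RemoveStaleState: removing container" lines come out at E (error) severity even though nothing failed. A sketch of the idea, with made-up data:

```go
package main

import "fmt"

// Illustrative only: a per-(podUID, container) assignment table, pruned of
// entries whose pod is no longer in the live pod list -- the shape of what
// the cpu_manager "RemoveStaleState" lines above record.
type key struct{ podUID, container string }

func main() {
	assignments := map[key]string{ // value is a made-up cpuset
		{"f6f9bb3a-4675-4149-ae26-1345c54e4d72", "proxy-httpd"}: "0-3",
		{"f6f9bb3a-4675-4149-ae26-1345c54e4d72", "sg-core"}:     "0-3",
	}
	livePods := map[string]bool{} // the old UID is gone from the API view

	for k := range assignments {
		if !livePods[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container %q of pod %s\n", k.container, k.podUID)
			delete(assignments, k) // logged as "Deleted CPUSet assignment"
		}
	}
}
```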
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.941398 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.941578 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 01 13:14:45 crc kubenswrapper[4851]: I1001 13:14:45.962567 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.033460 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " pod="openstack/ceilometer-0"
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.033538 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-run-httpd\") pod \"ceilometer-0\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " pod="openstack/ceilometer-0"
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.033572 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " pod="openstack/ceilometer-0"
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.033634 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-log-httpd\") pod \"ceilometer-0\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " pod="openstack/ceilometer-0"
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.033699 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-scripts\") pod \"ceilometer-0\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " pod="openstack/ceilometer-0"
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.033758 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfghl\" (UniqueName: \"kubernetes.io/projected/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-kube-api-access-pfghl\") pod \"ceilometer-0\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " pod="openstack/ceilometer-0"
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.033784 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-config-data\") pod \"ceilometer-0\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " pod="openstack/ceilometer-0"
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.135103 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " pod="openstack/ceilometer-0"
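[editor's note] The pod name is still ceilometer-0, but every UniqueName above now carries a fresh UID (a2e0d121-... instead of f6f9bb3a-...): deleting and re-adding a pod of the same name produces a new API object, so its volumes are verified and mounted from scratch. The UniqueName format is "<plugin>/<podUID>-<volumeName>", and since the UIDs in this log are 36-character UUIDs it splits unambiguously. A small parsing helper for log tooling (function and field names are my own, not kubelet's):

```go
package main

import (
	"fmt"
	"strings"
)

// parseUniqueName splits a kubelet volume UniqueName such as
// "kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-combined-ca-bundle"
// into plugin, pod UID (36-char UUID) and volume name.
func parseUniqueName(u string) (plugin, uid, volume string, err error) {
	i := strings.LastIndex(u, "/")
	if i < 0 || len(u) < i+1+36+2 {
		return "", "", "", fmt.Errorf("unexpected UniqueName %q", u)
	}
	rest := u[i+1:]
	return u[:i], rest[:36], rest[37:], nil
}

func main() {
	fmt.Println(parseUniqueName("kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-combined-ca-bundle"))
	// kubernetes.io/secret a2e0d121-a70d-4191-b1be-0743c0dc5cb3 combined-ca-bundle <nil>
}
```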
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.135154 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-run-httpd\") pod \"ceilometer-0\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " pod="openstack/ceilometer-0"
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.135177 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " pod="openstack/ceilometer-0"
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.135222 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-log-httpd\") pod \"ceilometer-0\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " pod="openstack/ceilometer-0"
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.135239 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-scripts\") pod \"ceilometer-0\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " pod="openstack/ceilometer-0"
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.135278 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfghl\" (UniqueName: \"kubernetes.io/projected/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-kube-api-access-pfghl\") pod \"ceilometer-0\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " pod="openstack/ceilometer-0"
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.135297 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-config-data\") pod \"ceilometer-0\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " pod="openstack/ceilometer-0"
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.136142 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-log-httpd\") pod \"ceilometer-0\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " pod="openstack/ceilometer-0"
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.136265 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-run-httpd\") pod \"ceilometer-0\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " pod="openstack/ceilometer-0"
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.139249 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-config-data\") pod \"ceilometer-0\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " pod="openstack/ceilometer-0"
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.139732 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " pod="openstack/ceilometer-0"
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.140079 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-scripts\") pod \"ceilometer-0\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " pod="openstack/ceilometer-0"
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.139541 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " pod="openstack/ceilometer-0"
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.162471 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfghl\" (UniqueName: \"kubernetes.io/projected/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-kube-api-access-pfghl\") pod \"ceilometer-0\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " pod="openstack/ceilometer-0"
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.256642 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.367939 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6f9bb3a-4675-4149-ae26-1345c54e4d72" path="/var/lib/kubelet/pods/f6f9bb3a-4675-4149-ae26-1345c54e4d72/volumes"
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.782002 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 13:14:46 crc kubenswrapper[4851]: I1001 13:14:46.852136 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a2e0d121-a70d-4191-b1be-0743c0dc5cb3","Type":"ContainerStarted","Data":"2e3e84aaeb7bbd4d9bfd18303ef7c36162eab14b1fcf71cd7cd68457f946cab3"}
Oct 01 13:14:47 crc kubenswrapper[4851]: I1001 13:14:47.864248 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a2e0d121-a70d-4191-b1be-0743c0dc5cb3","Type":"ContainerStarted","Data":"4a99d3eb3b18bcb979c9ac4efa2b4410c6d7b7b346af69d860cef8213d509ce1"}
Oct 01 13:14:47 crc kubenswrapper[4851]: I1001 13:14:47.864813 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a2e0d121-a70d-4191-b1be-0743c0dc5cb3","Type":"ContainerStarted","Data":"dcf80642bb4507cbb9aea4ca42b0c1767fb5107d1f4e26cf34d59ae8ec8ec052"}
Oct 01 13:14:48 crc kubenswrapper[4851]: I1001 13:14:48.883845 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a2e0d121-a70d-4191-b1be-0743c0dc5cb3","Type":"ContainerStarted","Data":"b05483be58abda3dc0e456d94214672896a0e78abb1c7839d792ed7977325612"}
Oct 01 13:14:49 crc kubenswrapper[4851]: I1001 13:14:49.894196 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a2e0d121-a70d-4191-b1be-0743c0dc5cb3","Type":"ContainerStarted","Data":"f0adab6aff3a94c0f7a7f34f876f84a98347dc28d6192c9c371ae874a6616283"}
Oct 01 13:14:49 crc kubenswrapper[4851]: I1001 13:14:49.895676 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 01 13:14:49 crc kubenswrapper[4851]: I1001 13:14:49.926528 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.160192175 podStartE2EDuration="4.926489247s" podCreationTimestamp="2025-10-01 13:14:45 +0000 UTC" firstStartedPulling="2025-10-01 13:14:46.793183423 +0000 UTC m=+1295.138300909" lastFinishedPulling="2025-10-01 13:14:49.559480495 +0000 UTC m=+1297.904597981" observedRunningTime="2025-10-01 13:14:49.921029159 +0000 UTC m=+1298.266146645" watchObservedRunningTime="2025-10-01 13:14:49.926489247 +0000 UTC m=+1298.271606733"
Oct 01 13:14:51 crc kubenswrapper[4851]: I1001 13:14:51.328553 4851 scope.go:117] "RemoveContainer" containerID="a68fbdceade2850c22adba2de4b6078e94625dcade81e98faf63ffd294964ee5"
Oct 01 13:14:51 crc kubenswrapper[4851]: E1001 13:14:51.329577 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(971ca0ac-6de7-42f1-bf29-5174fd80ced4)\"" pod="openstack/watcher-decision-engine-0" podUID="971ca0ac-6de7-42f1-bf29-5174fd80ced4"
Oct 01 13:14:52 crc kubenswrapper[4851]: I1001 13:14:52.999224 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 13:14:52 crc kubenswrapper[4851]: I1001 13:14:52.999681 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a2e0d121-a70d-4191-b1be-0743c0dc5cb3" containerName="sg-core" containerID="cri-o://b05483be58abda3dc0e456d94214672896a0e78abb1c7839d792ed7977325612" gracePeriod=30
Oct 01 13:14:52 crc kubenswrapper[4851]: I1001 13:14:52.999787 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a2e0d121-a70d-4191-b1be-0743c0dc5cb3" containerName="ceilometer-notification-agent" containerID="cri-o://4a99d3eb3b18bcb979c9ac4efa2b4410c6d7b7b346af69d860cef8213d509ce1" gracePeriod=30
Oct 01 13:14:52 crc kubenswrapper[4851]: I1001 13:14:52.999555 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a2e0d121-a70d-4191-b1be-0743c0dc5cb3" containerName="ceilometer-central-agent" containerID="cri-o://dcf80642bb4507cbb9aea4ca42b0c1767fb5107d1f4e26cf34d59ae8ec8ec052" gracePeriod=30
Oct 01 13:14:52 crc kubenswrapper[4851]: I1001 13:14:52.999679 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a2e0d121-a70d-4191-b1be-0743c0dc5cb3" containerName="proxy-httpd" containerID="cri-o://f0adab6aff3a94c0f7a7f34f876f84a98347dc28d6192c9c371ae874a6616283" gracePeriod=30
Oct 01 13:14:53 crc kubenswrapper[4851]: I1001 13:14:53.934994 4851 generic.go:334] "Generic (PLEG): container finished" podID="a2e0d121-a70d-4191-b1be-0743c0dc5cb3" containerID="f0adab6aff3a94c0f7a7f34f876f84a98347dc28d6192c9c371ae874a6616283" exitCode=0
Oct 01 13:14:53 crc kubenswrapper[4851]: I1001 13:14:53.935335 4851 generic.go:334] "Generic (PLEG): container finished" podID="a2e0d121-a70d-4191-b1be-0743c0dc5cb3" containerID="b05483be58abda3dc0e456d94214672896a0e78abb1c7839d792ed7977325612" exitCode=2
Oct 01 13:14:53 crc kubenswrapper[4851]: I1001 13:14:53.935345 4851 generic.go:334] "Generic (PLEG): container finished" podID="a2e0d121-a70d-4191-b1be-0743c0dc5cb3" containerID="4a99d3eb3b18bcb979c9ac4efa2b4410c6d7b7b346af69d860cef8213d509ce1" exitCode=0
Oct 01 13:14:53 crc kubenswrapper[4851]: I1001 13:14:53.935076 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a2e0d121-a70d-4191-b1be-0743c0dc5cb3","Type":"ContainerDied","Data":"f0adab6aff3a94c0f7a7f34f876f84a98347dc28d6192c9c371ae874a6616283"}
Oct 01 13:14:53 crc kubenswrapper[4851]: I1001 13:14:53.935380 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a2e0d121-a70d-4191-b1be-0743c0dc5cb3","Type":"ContainerDied","Data":"b05483be58abda3dc0e456d94214672896a0e78abb1c7839d792ed7977325612"}
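[editor's note] The watcher-decision-engine-0 error above is unrelated to ceilometer: that pod is in CrashLoopBackOff, so the kubelet refuses to restart its container until the current back-off expires. In kubelets of this vintage the crash-loop back-off starts at 10s and doubles per consecutive failure up to a 5m cap (assumed defaults), so "back-off 40s" points at roughly the third failure in a row. A sketch of the schedule:

```go
package main

import (
	"fmt"
	"time"
)

// Prints the assumed default kubelet crash-loop back-off ladder:
// 10s doubling per consecutive failure, capped at 5m.
func main() {
	const limit = 5 * time.Minute
	for b, n := 10*time.Second, 1; ; b, n = b*2, n+1 {
		if b > limit {
			b = limit
		}
		fmt.Printf("consecutive failure %d -> wait %v\n", n, b)
		if b == limit {
			break
		}
	}
	// Output: 10s, 20s, 40s, 1m20s, 2m40s, 5m
}
```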
event for pod" pod="openstack/ceilometer-0" event={"ID":"a2e0d121-a70d-4191-b1be-0743c0dc5cb3","Type":"ContainerDied","Data":"b05483be58abda3dc0e456d94214672896a0e78abb1c7839d792ed7977325612"} Oct 01 13:14:53 crc kubenswrapper[4851]: I1001 13:14:53.935394 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a2e0d121-a70d-4191-b1be-0743c0dc5cb3","Type":"ContainerDied","Data":"4a99d3eb3b18bcb979c9ac4efa2b4410c6d7b7b346af69d860cef8213d509ce1"} Oct 01 13:14:54 crc kubenswrapper[4851]: I1001 13:14:54.948308 4851 generic.go:334] "Generic (PLEG): container finished" podID="a2e0d121-a70d-4191-b1be-0743c0dc5cb3" containerID="dcf80642bb4507cbb9aea4ca42b0c1767fb5107d1f4e26cf34d59ae8ec8ec052" exitCode=0 Oct 01 13:14:54 crc kubenswrapper[4851]: I1001 13:14:54.948422 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a2e0d121-a70d-4191-b1be-0743c0dc5cb3","Type":"ContainerDied","Data":"dcf80642bb4507cbb9aea4ca42b0c1767fb5107d1f4e26cf34d59ae8ec8ec052"} Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.337340 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.438827 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-log-httpd\") pod \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.438886 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-scripts\") pod \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.438977 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfghl\" (UniqueName: \"kubernetes.io/projected/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-kube-api-access-pfghl\") pod \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.439053 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-config-data\") pod \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.439078 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-combined-ca-bundle\") pod \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.439131 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-run-httpd\") pod \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.439213 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-sg-core-conf-yaml\") pod \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\" (UID: \"a2e0d121-a70d-4191-b1be-0743c0dc5cb3\") " Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.439567 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a2e0d121-a70d-4191-b1be-0743c0dc5cb3" (UID: "a2e0d121-a70d-4191-b1be-0743c0dc5cb3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.439770 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a2e0d121-a70d-4191-b1be-0743c0dc5cb3" (UID: "a2e0d121-a70d-4191-b1be-0743c0dc5cb3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.440259 4851 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.440277 4851 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.446166 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-scripts" (OuterVolumeSpecName: "scripts") pod "a2e0d121-a70d-4191-b1be-0743c0dc5cb3" (UID: "a2e0d121-a70d-4191-b1be-0743c0dc5cb3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.461998 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-kube-api-access-pfghl" (OuterVolumeSpecName: "kube-api-access-pfghl") pod "a2e0d121-a70d-4191-b1be-0743c0dc5cb3" (UID: "a2e0d121-a70d-4191-b1be-0743c0dc5cb3"). InnerVolumeSpecName "kube-api-access-pfghl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.475471 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a2e0d121-a70d-4191-b1be-0743c0dc5cb3" (UID: "a2e0d121-a70d-4191-b1be-0743c0dc5cb3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.534475 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2e0d121-a70d-4191-b1be-0743c0dc5cb3" (UID: "a2e0d121-a70d-4191-b1be-0743c0dc5cb3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.542015 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.542048 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfghl\" (UniqueName: \"kubernetes.io/projected/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-kube-api-access-pfghl\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.542059 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.542070 4851 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.545778 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-config-data" (OuterVolumeSpecName: "config-data") pod "a2e0d121-a70d-4191-b1be-0743c0dc5cb3" (UID: "a2e0d121-a70d-4191-b1be-0743c0dc5cb3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.644093 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2e0d121-a70d-4191-b1be-0743c0dc5cb3-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.964182 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a2e0d121-a70d-4191-b1be-0743c0dc5cb3","Type":"ContainerDied","Data":"2e3e84aaeb7bbd4d9bfd18303ef7c36162eab14b1fcf71cd7cd68457f946cab3"} Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.964247 4851 scope.go:117] "RemoveContainer" containerID="f0adab6aff3a94c0f7a7f34f876f84a98347dc28d6192c9c371ae874a6616283" Oct 01 13:14:55 crc kubenswrapper[4851]: I1001 13:14:55.964480 4851 util.go:48] "No ready sandbox for pod can be found. 
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.008380 4851 scope.go:117] "RemoveContainer" containerID="b05483be58abda3dc0e456d94214672896a0e78abb1c7839d792ed7977325612"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.012802 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.027927 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.040469 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 01 13:14:56 crc kubenswrapper[4851]: E1001 13:14:56.040946 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e0d121-a70d-4191-b1be-0743c0dc5cb3" containerName="sg-core"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.040963 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e0d121-a70d-4191-b1be-0743c0dc5cb3" containerName="sg-core"
Oct 01 13:14:56 crc kubenswrapper[4851]: E1001 13:14:56.040984 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e0d121-a70d-4191-b1be-0743c0dc5cb3" containerName="ceilometer-central-agent"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.040990 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e0d121-a70d-4191-b1be-0743c0dc5cb3" containerName="ceilometer-central-agent"
Oct 01 13:14:56 crc kubenswrapper[4851]: E1001 13:14:56.041005 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e0d121-a70d-4191-b1be-0743c0dc5cb3" containerName="ceilometer-notification-agent"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.041012 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e0d121-a70d-4191-b1be-0743c0dc5cb3" containerName="ceilometer-notification-agent"
Oct 01 13:14:56 crc kubenswrapper[4851]: E1001 13:14:56.041024 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e0d121-a70d-4191-b1be-0743c0dc5cb3" containerName="proxy-httpd"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.041030 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e0d121-a70d-4191-b1be-0743c0dc5cb3" containerName="proxy-httpd"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.041218 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2e0d121-a70d-4191-b1be-0743c0dc5cb3" containerName="ceilometer-notification-agent"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.041229 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2e0d121-a70d-4191-b1be-0743c0dc5cb3" containerName="ceilometer-central-agent"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.041239 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2e0d121-a70d-4191-b1be-0743c0dc5cb3" containerName="proxy-httpd"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.041252 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2e0d121-a70d-4191-b1be-0743c0dc5cb3" containerName="sg-core"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.042858 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.044852 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.046742 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.047750 4851 scope.go:117] "RemoveContainer" containerID="4a99d3eb3b18bcb979c9ac4efa2b4410c6d7b7b346af69d860cef8213d509ce1"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.050835 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.081171 4851 scope.go:117] "RemoveContainer" containerID="dcf80642bb4507cbb9aea4ca42b0c1767fb5107d1f4e26cf34d59ae8ec8ec052"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.156971 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.157130 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-run-httpd\") pod \"ceilometer-0\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.157203 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ctrt\" (UniqueName: \"kubernetes.io/projected/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-kube-api-access-4ctrt\") pod \"ceilometer-0\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.157279 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.157376 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-log-httpd\") pod \"ceilometer-0\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.157519 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-config-data\") pod \"ceilometer-0\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.157783 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-scripts\") pod \"ceilometer-0\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.260243 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-scripts\") pod \"ceilometer-0\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.260390 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.260459 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-run-httpd\") pod \"ceilometer-0\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.260487 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ctrt\" (UniqueName: \"kubernetes.io/projected/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-kube-api-access-4ctrt\") pod \"ceilometer-0\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.260549 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.260609 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-log-httpd\") pod \"ceilometer-0\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.260646 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-config-data\") pod \"ceilometer-0\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.261221 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-run-httpd\") pod \"ceilometer-0\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.261586 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-log-httpd\") pod \"ceilometer-0\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.264389 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.264857 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-config-data\") pod \"ceilometer-0\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.265399 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.266140 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-scripts\") pod \"ceilometer-0\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.280639 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ctrt\" (UniqueName: \"kubernetes.io/projected/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-kube-api-access-4ctrt\") pod \"ceilometer-0\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.341414 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2e0d121-a70d-4191-b1be-0743c0dc5cb3" path="/var/lib/kubelet/pods/a2e0d121-a70d-4191-b1be-0743c0dc5cb3/volumes"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.372571 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.902034 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 13:14:56 crc kubenswrapper[4851]: I1001 13:14:56.985665 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe30e726-ab37-4631-ac05-9d8bd9fdc11d","Type":"ContainerStarted","Data":"333f1abc7f6a0888e59f60685b953575bd5fe206d8c3d9b57aa62986623505ee"}
Oct 01 13:14:58 crc kubenswrapper[4851]: I1001 13:14:57.999609 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe30e726-ab37-4631-ac05-9d8bd9fdc11d","Type":"ContainerStarted","Data":"21fe2fda604a2deb41cb3d92c904d646bc494b49c80cf037831b88ac0604badf"}
Oct 01 13:14:58 crc kubenswrapper[4851]: I1001 13:14:58.000207 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe30e726-ab37-4631-ac05-9d8bd9fdc11d","Type":"ContainerStarted","Data":"32a17062b9da1995cd1c182b579b4ded2007df740da4b0c82735407b44f536bc"}
Oct 01 13:14:59 crc kubenswrapper[4851]: I1001 13:14:59.011242 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe30e726-ab37-4631-ac05-9d8bd9fdc11d","Type":"ContainerStarted","Data":"d1049ceca9f4e8154f1b9b3e8f4314a6f27ddd2c51be7e3439319f6a7918d733"}
Oct 01 13:15:00 crc kubenswrapper[4851]: I1001 13:15:00.023496 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe30e726-ab37-4631-ac05-9d8bd9fdc11d","Type":"ContainerStarted","Data":"0d7557d025ec42c72d6586138fc901a704a8f48a305f88df5bea6a1e240b56d4"}
Oct 01 13:15:00 crc kubenswrapper[4851]: I1001 13:15:00.023899 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 01 13:15:00 crc kubenswrapper[4851]: I1001 13:15:00.051303 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.50806964 podStartE2EDuration="4.05127448s" podCreationTimestamp="2025-10-01 13:14:56 +0000 UTC" firstStartedPulling="2025-10-01 13:14:56.919049611 +0000 UTC m=+1305.264167097" lastFinishedPulling="2025-10-01 13:14:59.462254451 +0000 UTC m=+1307.807371937" observedRunningTime="2025-10-01 13:15:00.047491957 +0000 UTC m=+1308.392609443" watchObservedRunningTime="2025-10-01 13:15:00.05127448 +0000 UTC m=+1308.396391996"
Oct 01 13:15:00 crc kubenswrapper[4851]: I1001 13:15:00.157602 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322075-r692b"]
Oct 01 13:15:00 crc kubenswrapper[4851]: I1001 13:15:00.159477 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-r692b"
Oct 01 13:15:00 crc kubenswrapper[4851]: I1001 13:15:00.164958 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 01 13:15:00 crc kubenswrapper[4851]: I1001 13:15:00.165319 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 01 13:15:00 crc kubenswrapper[4851]: I1001 13:15:00.189891 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322075-r692b"]
Oct 01 13:15:00 crc kubenswrapper[4851]: I1001 13:15:00.249720 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a-config-volume\") pod \"collect-profiles-29322075-r692b\" (UID: \"39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-r692b"
Oct 01 13:15:00 crc kubenswrapper[4851]: I1001 13:15:00.249780 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a-secret-volume\") pod \"collect-profiles-29322075-r692b\" (UID: \"39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-r692b"
Oct 01 13:15:00 crc kubenswrapper[4851]: I1001 13:15:00.249849 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4v8w\" (UniqueName: \"kubernetes.io/projected/39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a-kube-api-access-h4v8w\") pod \"collect-profiles-29322075-r692b\" (UID: \"39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-r692b"
Oct 01 13:15:00 crc kubenswrapper[4851]: I1001 13:15:00.352753 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a-config-volume\") pod \"collect-profiles-29322075-r692b\" (UID: \"39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-r692b"
Oct 01 13:15:00 crc kubenswrapper[4851]: I1001 13:15:00.352950 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a-secret-volume\") pod \"collect-profiles-29322075-r692b\" (UID: \"39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-r692b"
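[editor's note] The collect-profiles-29322075-r692b pod above was created by OLM's collect-profiles CronJob. The numeric part of the Job name is the scheduled time in minutes since the Unix epoch (the trailing -r692b is the usual random pod suffix), and it decodes to exactly the 13:15:00 tick at which the kubelet logged the SyncLoop ADD:

```go
package main

import (
	"fmt"
	"time"
)

// Decodes the CronJob-generated Job name suffix "29322075":
// scheduled time = minutes since the Unix epoch.
func main() {
	const minutes = 29322075
	fmt.Println(time.Unix(minutes*60, 0).UTC())
	// 2025-10-01 13:15:00 +0000 UTC -- matching the pod ADD above
}
```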
\"kubernetes.io/secret/39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a-secret-volume\") pod \"collect-profiles-29322075-r692b\" (UID: \"39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-r692b" Oct 01 13:15:00 crc kubenswrapper[4851]: I1001 13:15:00.353033 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4v8w\" (UniqueName: \"kubernetes.io/projected/39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a-kube-api-access-h4v8w\") pod \"collect-profiles-29322075-r692b\" (UID: \"39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-r692b" Oct 01 13:15:00 crc kubenswrapper[4851]: I1001 13:15:00.354929 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a-config-volume\") pod \"collect-profiles-29322075-r692b\" (UID: \"39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-r692b" Oct 01 13:15:00 crc kubenswrapper[4851]: I1001 13:15:00.367358 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a-secret-volume\") pod \"collect-profiles-29322075-r692b\" (UID: \"39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-r692b" Oct 01 13:15:00 crc kubenswrapper[4851]: I1001 13:15:00.377273 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4v8w\" (UniqueName: \"kubernetes.io/projected/39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a-kube-api-access-h4v8w\") pod \"collect-profiles-29322075-r692b\" (UID: \"39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-r692b" Oct 01 13:15:00 crc kubenswrapper[4851]: I1001 13:15:00.487868 4851 util.go:30] "No sandbox for pod can be found. 
Oct 01 13:15:00 crc kubenswrapper[4851]: I1001 13:15:00.958952 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322075-r692b"]
Oct 01 13:15:00 crc kubenswrapper[4851]: W1001 13:15:00.961647 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39b2bfbd_8498_4fb9_9ad3_91ad2de6f40a.slice/crio-4ce20af68e2684d220943388f09c367d851565e34f598c306a803d091577cf57 WatchSource:0}: Error finding container 4ce20af68e2684d220943388f09c367d851565e34f598c306a803d091577cf57: Status 404 returned error can't find the container with id 4ce20af68e2684d220943388f09c367d851565e34f598c306a803d091577cf57
Oct 01 13:15:01 crc kubenswrapper[4851]: I1001 13:15:01.037851 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-r692b" event={"ID":"39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a","Type":"ContainerStarted","Data":"4ce20af68e2684d220943388f09c367d851565e34f598c306a803d091577cf57"}
Oct 01 13:15:01 crc kubenswrapper[4851]: I1001 13:15:01.041762 4851 generic.go:334] "Generic (PLEG): container finished" podID="1a3816a8-ed2c-4f4c-a4ba-485b9182a68e" containerID="dd1647f0669f311c2c2126b67d0997ad5e6912fd8eaf2618076fcd267c28eabb" exitCode=0
Oct 01 13:15:01 crc kubenswrapper[4851]: I1001 13:15:01.041883 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dx6g6" event={"ID":"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e","Type":"ContainerDied","Data":"dd1647f0669f311c2c2126b67d0997ad5e6912fd8eaf2618076fcd267c28eabb"}
Oct 01 13:15:02 crc kubenswrapper[4851]: I1001 13:15:02.056230 4851 generic.go:334] "Generic (PLEG): container finished" podID="39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a" containerID="e688f202087fe49bebaf26601d5e50907c99109fcb302411af563698f0864152" exitCode=0
Oct 01 13:15:02 crc kubenswrapper[4851]: I1001 13:15:02.056414 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-r692b" event={"ID":"39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a","Type":"ContainerDied","Data":"e688f202087fe49bebaf26601d5e50907c99109fcb302411af563698f0864152"}
Oct 01 13:15:02 crc kubenswrapper[4851]: I1001 13:15:02.475548 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dx6g6"
Oct 01 13:15:02 crc kubenswrapper[4851]: I1001 13:15:02.603272 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-combined-ca-bundle\") pod \"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e\" (UID: \"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e\") "
Oct 01 13:15:02 crc kubenswrapper[4851]: I1001 13:15:02.603456 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-scripts\") pod \"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e\" (UID: \"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e\") "
Oct 01 13:15:02 crc kubenswrapper[4851]: I1001 13:15:02.603570 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-config-data\") pod \"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e\" (UID: \"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e\") "
Oct 01 13:15:02 crc kubenswrapper[4851]: I1001 13:15:02.604426 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjhkv\" (UniqueName: \"kubernetes.io/projected/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-kube-api-access-tjhkv\") pod \"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e\" (UID: \"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e\") "
Oct 01 13:15:02 crc kubenswrapper[4851]: I1001 13:15:02.610063 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-scripts" (OuterVolumeSpecName: "scripts") pod "1a3816a8-ed2c-4f4c-a4ba-485b9182a68e" (UID: "1a3816a8-ed2c-4f4c-a4ba-485b9182a68e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:15:02 crc kubenswrapper[4851]: I1001 13:15:02.611036 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-kube-api-access-tjhkv" (OuterVolumeSpecName: "kube-api-access-tjhkv") pod "1a3816a8-ed2c-4f4c-a4ba-485b9182a68e" (UID: "1a3816a8-ed2c-4f4c-a4ba-485b9182a68e"). InnerVolumeSpecName "kube-api-access-tjhkv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:15:02 crc kubenswrapper[4851]: I1001 13:15:02.634354 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-config-data" (OuterVolumeSpecName: "config-data") pod "1a3816a8-ed2c-4f4c-a4ba-485b9182a68e" (UID: "1a3816a8-ed2c-4f4c-a4ba-485b9182a68e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:15:02 crc kubenswrapper[4851]: I1001 13:15:02.651325 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a3816a8-ed2c-4f4c-a4ba-485b9182a68e" (UID: "1a3816a8-ed2c-4f4c-a4ba-485b9182a68e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:02 crc kubenswrapper[4851]: I1001 13:15:02.708223 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:02 crc kubenswrapper[4851]: I1001 13:15:02.708276 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:02 crc kubenswrapper[4851]: I1001 13:15:02.708297 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:02 crc kubenswrapper[4851]: I1001 13:15:02.708316 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjhkv\" (UniqueName: \"kubernetes.io/projected/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e-kube-api-access-tjhkv\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.073653 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dx6g6" event={"ID":"1a3816a8-ed2c-4f4c-a4ba-485b9182a68e","Type":"ContainerDied","Data":"acc557d8a3ab34beff6079af3e4a320d42d3d19541eeeba34577d9c2d6202fa0"} Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.073713 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acc557d8a3ab34beff6079af3e4a320d42d3d19541eeeba34577d9c2d6202fa0" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.073757 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dx6g6" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.248631 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 13:15:03 crc kubenswrapper[4851]: E1001 13:15:03.249172 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a3816a8-ed2c-4f4c-a4ba-485b9182a68e" containerName="nova-cell0-conductor-db-sync" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.249194 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a3816a8-ed2c-4f4c-a4ba-485b9182a68e" containerName="nova-cell0-conductor-db-sync" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.249463 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a3816a8-ed2c-4f4c-a4ba-485b9182a68e" containerName="nova-cell0-conductor-db-sync" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.250389 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.253093 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-8crgs" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.253580 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.264109 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.426742 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7d4479-701c-47c9-92b6-31ea543b479f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1e7d4479-701c-47c9-92b6-31ea543b479f\") " pod="openstack/nova-cell0-conductor-0" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.426795 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7d4479-701c-47c9-92b6-31ea543b479f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1e7d4479-701c-47c9-92b6-31ea543b479f\") " pod="openstack/nova-cell0-conductor-0" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.427045 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckjk8\" (UniqueName: \"kubernetes.io/projected/1e7d4479-701c-47c9-92b6-31ea543b479f-kube-api-access-ckjk8\") pod \"nova-cell0-conductor-0\" (UID: \"1e7d4479-701c-47c9-92b6-31ea543b479f\") " pod="openstack/nova-cell0-conductor-0" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.520794 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-r692b" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.528647 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckjk8\" (UniqueName: \"kubernetes.io/projected/1e7d4479-701c-47c9-92b6-31ea543b479f-kube-api-access-ckjk8\") pod \"nova-cell0-conductor-0\" (UID: \"1e7d4479-701c-47c9-92b6-31ea543b479f\") " pod="openstack/nova-cell0-conductor-0" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.528828 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7d4479-701c-47c9-92b6-31ea543b479f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1e7d4479-701c-47c9-92b6-31ea543b479f\") " pod="openstack/nova-cell0-conductor-0" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.528982 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7d4479-701c-47c9-92b6-31ea543b479f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1e7d4479-701c-47c9-92b6-31ea543b479f\") " pod="openstack/nova-cell0-conductor-0" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.533207 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7d4479-701c-47c9-92b6-31ea543b479f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1e7d4479-701c-47c9-92b6-31ea543b479f\") " pod="openstack/nova-cell0-conductor-0" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.534807 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7d4479-701c-47c9-92b6-31ea543b479f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1e7d4479-701c-47c9-92b6-31ea543b479f\") " pod="openstack/nova-cell0-conductor-0" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.551051 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckjk8\" (UniqueName: \"kubernetes.io/projected/1e7d4479-701c-47c9-92b6-31ea543b479f-kube-api-access-ckjk8\") pod \"nova-cell0-conductor-0\" (UID: \"1e7d4479-701c-47c9-92b6-31ea543b479f\") " pod="openstack/nova-cell0-conductor-0" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.570759 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.629921 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a-secret-volume\") pod \"39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a\" (UID: \"39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a\") " Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.630529 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4v8w\" (UniqueName: \"kubernetes.io/projected/39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a-kube-api-access-h4v8w\") pod \"39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a\" (UID: \"39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a\") " Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.630737 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a-config-volume\") pod \"39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a\" (UID: \"39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a\") " Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.631348 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a-config-volume" (OuterVolumeSpecName: "config-volume") pod "39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a" (UID: "39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.635811 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a" (UID: "39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.636452 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a-kube-api-access-h4v8w" (OuterVolumeSpecName: "kube-api-access-h4v8w") pod "39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a" (UID: "39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a"). InnerVolumeSpecName "kube-api-access-h4v8w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.733137 4851 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.733196 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4v8w\" (UniqueName: \"kubernetes.io/projected/39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a-kube-api-access-h4v8w\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.733219 4851 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:03 crc kubenswrapper[4851]: I1001 13:15:03.860185 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 13:15:03 crc kubenswrapper[4851]: W1001 13:15:03.863579 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e7d4479_701c_47c9_92b6_31ea543b479f.slice/crio-3360cf9c69733ae23e34268f95fe5bedb22477a6c77a039add8e8fbb69c26ed1 WatchSource:0}: Error finding container 3360cf9c69733ae23e34268f95fe5bedb22477a6c77a039add8e8fbb69c26ed1: Status 404 returned error can't find the container with id 3360cf9c69733ae23e34268f95fe5bedb22477a6c77a039add8e8fbb69c26ed1 Oct 01 13:15:04 crc kubenswrapper[4851]: I1001 13:15:04.085546 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-r692b" event={"ID":"39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a","Type":"ContainerDied","Data":"4ce20af68e2684d220943388f09c367d851565e34f598c306a803d091577cf57"} Oct 01 13:15:04 crc kubenswrapper[4851]: I1001 13:15:04.085828 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ce20af68e2684d220943388f09c367d851565e34f598c306a803d091577cf57" Oct 01 13:15:04 crc kubenswrapper[4851]: I1001 13:15:04.085606 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-r692b" Oct 01 13:15:04 crc kubenswrapper[4851]: I1001 13:15:04.087756 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1e7d4479-701c-47c9-92b6-31ea543b479f","Type":"ContainerStarted","Data":"b40cc176c84f51edd0e0ba14367da30945afbc33f290e390e63caafdcd1f4417"} Oct 01 13:15:04 crc kubenswrapper[4851]: I1001 13:15:04.087829 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1e7d4479-701c-47c9-92b6-31ea543b479f","Type":"ContainerStarted","Data":"3360cf9c69733ae23e34268f95fe5bedb22477a6c77a039add8e8fbb69c26ed1"} Oct 01 13:15:04 crc kubenswrapper[4851]: I1001 13:15:04.088066 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 01 13:15:04 crc kubenswrapper[4851]: I1001 13:15:04.329769 4851 scope.go:117] "RemoveContainer" containerID="a68fbdceade2850c22adba2de4b6078e94625dcade81e98faf63ffd294964ee5" Oct 01 13:15:04 crc kubenswrapper[4851]: I1001 13:15:04.560167 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.560141912 podStartE2EDuration="1.560141912s" podCreationTimestamp="2025-10-01 13:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:15:04.115394223 +0000 UTC m=+1312.460511759" watchObservedRunningTime="2025-10-01 13:15:04.560141912 +0000 UTC m=+1312.905259408" Oct 01 13:15:05 crc kubenswrapper[4851]: I1001 13:15:05.100484 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"971ca0ac-6de7-42f1-bf29-5174fd80ced4","Type":"ContainerStarted","Data":"249cf7f49b57cac785ce282b8b69156e972d25a3c1b1f64fb9b059b6c34b58ec"} Oct 01 13:15:05 crc kubenswrapper[4851]: I1001 13:15:05.247319 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 01 13:15:05 crc kubenswrapper[4851]: I1001 13:15:05.288003 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 01 13:15:06 crc kubenswrapper[4851]: I1001 13:15:06.115232 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 01 13:15:06 crc kubenswrapper[4851]: I1001 13:15:06.179388 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Oct 01 13:15:13 crc kubenswrapper[4851]: I1001 13:15:13.610758 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.157769 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-7shqh"] Oct 01 13:15:14 crc kubenswrapper[4851]: E1001 13:15:14.158194 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a" containerName="collect-profiles" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.158213 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a" containerName="collect-profiles" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.158394 4851 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a" containerName="collect-profiles" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.159100 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7shqh" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.162087 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.162683 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.179127 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed619e77-8ae0-4eaf-8c68-611b8883f603-config-data\") pod \"nova-cell0-cell-mapping-7shqh\" (UID: \"ed619e77-8ae0-4eaf-8c68-611b8883f603\") " pod="openstack/nova-cell0-cell-mapping-7shqh" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.179172 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjmfk\" (UniqueName: \"kubernetes.io/projected/ed619e77-8ae0-4eaf-8c68-611b8883f603-kube-api-access-cjmfk\") pod \"nova-cell0-cell-mapping-7shqh\" (UID: \"ed619e77-8ae0-4eaf-8c68-611b8883f603\") " pod="openstack/nova-cell0-cell-mapping-7shqh" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.179206 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed619e77-8ae0-4eaf-8c68-611b8883f603-scripts\") pod \"nova-cell0-cell-mapping-7shqh\" (UID: \"ed619e77-8ae0-4eaf-8c68-611b8883f603\") " pod="openstack/nova-cell0-cell-mapping-7shqh" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.179242 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-7shqh"] Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.179456 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed619e77-8ae0-4eaf-8c68-611b8883f603-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7shqh\" (UID: \"ed619e77-8ae0-4eaf-8c68-611b8883f603\") " pod="openstack/nova-cell0-cell-mapping-7shqh" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.283648 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed619e77-8ae0-4eaf-8c68-611b8883f603-config-data\") pod \"nova-cell0-cell-mapping-7shqh\" (UID: \"ed619e77-8ae0-4eaf-8c68-611b8883f603\") " pod="openstack/nova-cell0-cell-mapping-7shqh" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.283706 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjmfk\" (UniqueName: \"kubernetes.io/projected/ed619e77-8ae0-4eaf-8c68-611b8883f603-kube-api-access-cjmfk\") pod \"nova-cell0-cell-mapping-7shqh\" (UID: \"ed619e77-8ae0-4eaf-8c68-611b8883f603\") " pod="openstack/nova-cell0-cell-mapping-7shqh" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.283740 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed619e77-8ae0-4eaf-8c68-611b8883f603-scripts\") pod \"nova-cell0-cell-mapping-7shqh\" (UID: \"ed619e77-8ae0-4eaf-8c68-611b8883f603\") " 
pod="openstack/nova-cell0-cell-mapping-7shqh" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.283805 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed619e77-8ae0-4eaf-8c68-611b8883f603-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7shqh\" (UID: \"ed619e77-8ae0-4eaf-8c68-611b8883f603\") " pod="openstack/nova-cell0-cell-mapping-7shqh" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.291224 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed619e77-8ae0-4eaf-8c68-611b8883f603-scripts\") pod \"nova-cell0-cell-mapping-7shqh\" (UID: \"ed619e77-8ae0-4eaf-8c68-611b8883f603\") " pod="openstack/nova-cell0-cell-mapping-7shqh" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.292155 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed619e77-8ae0-4eaf-8c68-611b8883f603-config-data\") pod \"nova-cell0-cell-mapping-7shqh\" (UID: \"ed619e77-8ae0-4eaf-8c68-611b8883f603\") " pod="openstack/nova-cell0-cell-mapping-7shqh" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.301152 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed619e77-8ae0-4eaf-8c68-611b8883f603-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7shqh\" (UID: \"ed619e77-8ae0-4eaf-8c68-611b8883f603\") " pod="openstack/nova-cell0-cell-mapping-7shqh" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.323994 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjmfk\" (UniqueName: \"kubernetes.io/projected/ed619e77-8ae0-4eaf-8c68-611b8883f603-kube-api-access-cjmfk\") pod \"nova-cell0-cell-mapping-7shqh\" (UID: \"ed619e77-8ae0-4eaf-8c68-611b8883f603\") " pod="openstack/nova-cell0-cell-mapping-7shqh" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.406577 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.408084 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.416639 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.417699 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.494963 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a3f1fc-2d6b-4d3a-95c3-b36258749d53-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"16a3f1fc-2d6b-4d3a-95c3-b36258749d53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.495477 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr2bv\" (UniqueName: \"kubernetes.io/projected/16a3f1fc-2d6b-4d3a-95c3-b36258749d53-kube-api-access-nr2bv\") pod \"nova-cell1-novncproxy-0\" (UID: \"16a3f1fc-2d6b-4d3a-95c3-b36258749d53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.495795 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a3f1fc-2d6b-4d3a-95c3-b36258749d53-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"16a3f1fc-2d6b-4d3a-95c3-b36258749d53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.501906 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7shqh" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.559559 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.560885 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.568717 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.573110 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.574945 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.578012 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.591017 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.638556 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.660148 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a3f1fc-2d6b-4d3a-95c3-b36258749d53-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"16a3f1fc-2d6b-4d3a-95c3-b36258749d53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.660237 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d57440b-c82a-4b73-82ae-f9a432089203-config-data\") pod \"nova-scheduler-0\" (UID: \"1d57440b-c82a-4b73-82ae-f9a432089203\") " pod="openstack/nova-scheduler-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.660306 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce33484-970b-4326-b51a-899bfaee9727-logs\") pod \"nova-metadata-0\" (UID: \"4ce33484-970b-4326-b51a-899bfaee9727\") " pod="openstack/nova-metadata-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.660360 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbkcn\" (UniqueName: \"kubernetes.io/projected/4ce33484-970b-4326-b51a-899bfaee9727-kube-api-access-cbkcn\") pod \"nova-metadata-0\" (UID: \"4ce33484-970b-4326-b51a-899bfaee9727\") " pod="openstack/nova-metadata-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.660458 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a3f1fc-2d6b-4d3a-95c3-b36258749d53-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"16a3f1fc-2d6b-4d3a-95c3-b36258749d53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.660474 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfs24\" (UniqueName: \"kubernetes.io/projected/1d57440b-c82a-4b73-82ae-f9a432089203-kube-api-access-xfs24\") pod \"nova-scheduler-0\" (UID: \"1d57440b-c82a-4b73-82ae-f9a432089203\") " pod="openstack/nova-scheduler-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.660524 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d57440b-c82a-4b73-82ae-f9a432089203-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1d57440b-c82a-4b73-82ae-f9a432089203\") " pod="openstack/nova-scheduler-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.660633 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr2bv\" (UniqueName: \"kubernetes.io/projected/16a3f1fc-2d6b-4d3a-95c3-b36258749d53-kube-api-access-nr2bv\") pod \"nova-cell1-novncproxy-0\" (UID: \"16a3f1fc-2d6b-4d3a-95c3-b36258749d53\") 
" pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.673492 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce33484-970b-4326-b51a-899bfaee9727-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4ce33484-970b-4326-b51a-899bfaee9727\") " pod="openstack/nova-metadata-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.673582 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce33484-970b-4326-b51a-899bfaee9727-config-data\") pod \"nova-metadata-0\" (UID: \"4ce33484-970b-4326-b51a-899bfaee9727\") " pod="openstack/nova-metadata-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.697262 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a3f1fc-2d6b-4d3a-95c3-b36258749d53-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"16a3f1fc-2d6b-4d3a-95c3-b36258749d53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.698948 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a3f1fc-2d6b-4d3a-95c3-b36258749d53-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"16a3f1fc-2d6b-4d3a-95c3-b36258749d53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.722186 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr2bv\" (UniqueName: \"kubernetes.io/projected/16a3f1fc-2d6b-4d3a-95c3-b36258749d53-kube-api-access-nr2bv\") pod \"nova-cell1-novncproxy-0\" (UID: \"16a3f1fc-2d6b-4d3a-95c3-b36258749d53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.763958 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.777551 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbkcn\" (UniqueName: \"kubernetes.io/projected/4ce33484-970b-4326-b51a-899bfaee9727-kube-api-access-cbkcn\") pod \"nova-metadata-0\" (UID: \"4ce33484-970b-4326-b51a-899bfaee9727\") " pod="openstack/nova-metadata-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.777868 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfs24\" (UniqueName: \"kubernetes.io/projected/1d57440b-c82a-4b73-82ae-f9a432089203-kube-api-access-xfs24\") pod \"nova-scheduler-0\" (UID: \"1d57440b-c82a-4b73-82ae-f9a432089203\") " pod="openstack/nova-scheduler-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.777962 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d57440b-c82a-4b73-82ae-f9a432089203-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1d57440b-c82a-4b73-82ae-f9a432089203\") " pod="openstack/nova-scheduler-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.778131 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce33484-970b-4326-b51a-899bfaee9727-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4ce33484-970b-4326-b51a-899bfaee9727\") " pod="openstack/nova-metadata-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.778252 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce33484-970b-4326-b51a-899bfaee9727-config-data\") pod \"nova-metadata-0\" (UID: \"4ce33484-970b-4326-b51a-899bfaee9727\") " pod="openstack/nova-metadata-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.778374 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d57440b-c82a-4b73-82ae-f9a432089203-config-data\") pod \"nova-scheduler-0\" (UID: \"1d57440b-c82a-4b73-82ae-f9a432089203\") " pod="openstack/nova-scheduler-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.778551 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce33484-970b-4326-b51a-899bfaee9727-logs\") pod \"nova-metadata-0\" (UID: \"4ce33484-970b-4326-b51a-899bfaee9727\") " pod="openstack/nova-metadata-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.782365 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c4f994dc9-lkgs9"] Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.782592 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d57440b-c82a-4b73-82ae-f9a432089203-config-data\") pod \"nova-scheduler-0\" (UID: \"1d57440b-c82a-4b73-82ae-f9a432089203\") " pod="openstack/nova-scheduler-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.782626 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce33484-970b-4326-b51a-899bfaee9727-logs\") pod \"nova-metadata-0\" (UID: \"4ce33484-970b-4326-b51a-899bfaee9727\") " pod="openstack/nova-metadata-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.789380 4851 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce33484-970b-4326-b51a-899bfaee9727-config-data\") pod \"nova-metadata-0\" (UID: \"4ce33484-970b-4326-b51a-899bfaee9727\") " pod="openstack/nova-metadata-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.793691 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce33484-970b-4326-b51a-899bfaee9727-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4ce33484-970b-4326-b51a-899bfaee9727\") " pod="openstack/nova-metadata-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.794684 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.795688 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d57440b-c82a-4b73-82ae-f9a432089203-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1d57440b-c82a-4b73-82ae-f9a432089203\") " pod="openstack/nova-scheduler-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.804674 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.807598 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.809337 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.809820 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfs24\" (UniqueName: \"kubernetes.io/projected/1d57440b-c82a-4b73-82ae-f9a432089203-kube-api-access-xfs24\") pod \"nova-scheduler-0\" (UID: \"1d57440b-c82a-4b73-82ae-f9a432089203\") " pod="openstack/nova-scheduler-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.822570 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbkcn\" (UniqueName: \"kubernetes.io/projected/4ce33484-970b-4326-b51a-899bfaee9727-kube-api-access-cbkcn\") pod \"nova-metadata-0\" (UID: \"4ce33484-970b-4326-b51a-899bfaee9727\") " pod="openstack/nova-metadata-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.831273 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.881664 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-ovsdbserver-sb\") pod \"dnsmasq-dns-6c4f994dc9-lkgs9\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.881964 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-dns-swift-storage-0\") pod \"dnsmasq-dns-6c4f994dc9-lkgs9\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.882039 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbdp6\" (UniqueName: 
\"kubernetes.io/projected/6db851a0-f990-44cf-8766-ef6891037e3a-kube-api-access-mbdp6\") pod \"dnsmasq-dns-6c4f994dc9-lkgs9\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.882125 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-ovsdbserver-nb\") pod \"dnsmasq-dns-6c4f994dc9-lkgs9\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.882224 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-dns-svc\") pod \"dnsmasq-dns-6c4f994dc9-lkgs9\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.882245 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-config\") pod \"dnsmasq-dns-6c4f994dc9-lkgs9\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.884552 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c4f994dc9-lkgs9"] Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.990714 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e85f88e-bad5-4b06-8767-b1720bacf2ea-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1e85f88e-bad5-4b06-8767-b1720bacf2ea\") " pod="openstack/nova-api-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.990829 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e85f88e-bad5-4b06-8767-b1720bacf2ea-config-data\") pod \"nova-api-0\" (UID: \"1e85f88e-bad5-4b06-8767-b1720bacf2ea\") " pod="openstack/nova-api-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.990882 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-ovsdbserver-sb\") pod \"dnsmasq-dns-6c4f994dc9-lkgs9\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.990902 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e85f88e-bad5-4b06-8767-b1720bacf2ea-logs\") pod \"nova-api-0\" (UID: \"1e85f88e-bad5-4b06-8767-b1720bacf2ea\") " pod="openstack/nova-api-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.995088 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh7jh\" (UniqueName: \"kubernetes.io/projected/1e85f88e-bad5-4b06-8767-b1720bacf2ea-kube-api-access-mh7jh\") pod \"nova-api-0\" (UID: \"1e85f88e-bad5-4b06-8767-b1720bacf2ea\") " pod="openstack/nova-api-0" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.995130 4851 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-dns-swift-storage-0\") pod \"dnsmasq-dns-6c4f994dc9-lkgs9\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.995176 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbdp6\" (UniqueName: \"kubernetes.io/projected/6db851a0-f990-44cf-8766-ef6891037e3a-kube-api-access-mbdp6\") pod \"dnsmasq-dns-6c4f994dc9-lkgs9\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.995225 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-ovsdbserver-nb\") pod \"dnsmasq-dns-6c4f994dc9-lkgs9\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.995304 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-dns-svc\") pod \"dnsmasq-dns-6c4f994dc9-lkgs9\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.995330 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-config\") pod \"dnsmasq-dns-6c4f994dc9-lkgs9\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.996146 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-config\") pod \"dnsmasq-dns-6c4f994dc9-lkgs9\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" Oct 01 13:15:14 crc kubenswrapper[4851]: I1001 13:15:14.994881 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-ovsdbserver-sb\") pod \"dnsmasq-dns-6c4f994dc9-lkgs9\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.001330 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-ovsdbserver-nb\") pod \"dnsmasq-dns-6c4f994dc9-lkgs9\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.001875 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-dns-svc\") pod \"dnsmasq-dns-6c4f994dc9-lkgs9\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.002183 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-dns-swift-storage-0\") pod \"dnsmasq-dns-6c4f994dc9-lkgs9\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.019586 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbdp6\" (UniqueName: \"kubernetes.io/projected/6db851a0-f990-44cf-8766-ef6891037e3a-kube-api-access-mbdp6\") pod \"dnsmasq-dns-6c4f994dc9-lkgs9\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.078046 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.100551 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e85f88e-bad5-4b06-8767-b1720bacf2ea-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1e85f88e-bad5-4b06-8767-b1720bacf2ea\") " pod="openstack/nova-api-0" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.100616 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e85f88e-bad5-4b06-8767-b1720bacf2ea-config-data\") pod \"nova-api-0\" (UID: \"1e85f88e-bad5-4b06-8767-b1720bacf2ea\") " pod="openstack/nova-api-0" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.100653 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e85f88e-bad5-4b06-8767-b1720bacf2ea-logs\") pod \"nova-api-0\" (UID: \"1e85f88e-bad5-4b06-8767-b1720bacf2ea\") " pod="openstack/nova-api-0" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.100712 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh7jh\" (UniqueName: \"kubernetes.io/projected/1e85f88e-bad5-4b06-8767-b1720bacf2ea-kube-api-access-mh7jh\") pod \"nova-api-0\" (UID: \"1e85f88e-bad5-4b06-8767-b1720bacf2ea\") " pod="openstack/nova-api-0" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.105300 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.106048 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e85f88e-bad5-4b06-8767-b1720bacf2ea-logs\") pod \"nova-api-0\" (UID: \"1e85f88e-bad5-4b06-8767-b1720bacf2ea\") " pod="openstack/nova-api-0" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.114147 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e85f88e-bad5-4b06-8767-b1720bacf2ea-config-data\") pod \"nova-api-0\" (UID: \"1e85f88e-bad5-4b06-8767-b1720bacf2ea\") " pod="openstack/nova-api-0" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.118205 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e85f88e-bad5-4b06-8767-b1720bacf2ea-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1e85f88e-bad5-4b06-8767-b1720bacf2ea\") " pod="openstack/nova-api-0" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.118659 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh7jh\" (UniqueName: \"kubernetes.io/projected/1e85f88e-bad5-4b06-8767-b1720bacf2ea-kube-api-access-mh7jh\") pod \"nova-api-0\" (UID: \"1e85f88e-bad5-4b06-8767-b1720bacf2ea\") " pod="openstack/nova-api-0" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.174826 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.200659 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 13:15:15 crc kubenswrapper[4851]: W1001 13:15:15.310174 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded619e77_8ae0_4eaf_8c68_611b8883f603.slice/crio-e9687405733e53e25d89dce15fec93c4a226014fd865ecffdbb050fae86edfce WatchSource:0}: Error finding container e9687405733e53e25d89dce15fec93c4a226014fd865ecffdbb050fae86edfce: Status 404 returned error can't find the container with id e9687405733e53e25d89dce15fec93c4a226014fd865ecffdbb050fae86edfce Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.311869 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-7shqh"] Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.386349 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.563204 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-q4cn9"] Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.566399 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-q4cn9" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.569171 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.570061 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.582013 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-q4cn9"] Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.728044 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d6tw\" (UniqueName: \"kubernetes.io/projected/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-kube-api-access-2d6tw\") pod \"nova-cell1-conductor-db-sync-q4cn9\" (UID: \"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b\") " pod="openstack/nova-cell1-conductor-db-sync-q4cn9" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.728096 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-scripts\") pod \"nova-cell1-conductor-db-sync-q4cn9\" (UID: \"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b\") " pod="openstack/nova-cell1-conductor-db-sync-q4cn9" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.728162 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-config-data\") pod \"nova-cell1-conductor-db-sync-q4cn9\" (UID: \"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b\") " pod="openstack/nova-cell1-conductor-db-sync-q4cn9" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.728367 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-q4cn9\" (UID: \"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b\") " pod="openstack/nova-cell1-conductor-db-sync-q4cn9" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.830446 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-config-data\") pod \"nova-cell1-conductor-db-sync-q4cn9\" (UID: \"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b\") " pod="openstack/nova-cell1-conductor-db-sync-q4cn9" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.830567 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-q4cn9\" (UID: \"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b\") " pod="openstack/nova-cell1-conductor-db-sync-q4cn9" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.830668 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d6tw\" (UniqueName: \"kubernetes.io/projected/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-kube-api-access-2d6tw\") pod \"nova-cell1-conductor-db-sync-q4cn9\" (UID: \"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b\") " pod="openstack/nova-cell1-conductor-db-sync-q4cn9" Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.830695 4851 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-scripts\") pod \"nova-cell1-conductor-db-sync-q4cn9\" (UID: \"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b\") " pod="openstack/nova-cell1-conductor-db-sync-q4cn9"
Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.836419 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-q4cn9\" (UID: \"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b\") " pod="openstack/nova-cell1-conductor-db-sync-q4cn9"
Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.837725 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-config-data\") pod \"nova-cell1-conductor-db-sync-q4cn9\" (UID: \"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b\") " pod="openstack/nova-cell1-conductor-db-sync-q4cn9"
Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.841541 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-scripts\") pod \"nova-cell1-conductor-db-sync-q4cn9\" (UID: \"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b\") " pod="openstack/nova-cell1-conductor-db-sync-q4cn9"
Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.847011 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d6tw\" (UniqueName: \"kubernetes.io/projected/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-kube-api-access-2d6tw\") pod \"nova-cell1-conductor-db-sync-q4cn9\" (UID: \"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b\") " pod="openstack/nova-cell1-conductor-db-sync-q4cn9"
Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.925875 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 01 13:15:15 crc kubenswrapper[4851]: W1001 13:15:15.932942 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d57440b_c82a_4b73_82ae_f9a432089203.slice/crio-c31f4ff6a04e143d77295c65dc99eb274e725f272e14d23a67557303725b2f48 WatchSource:0}: Error finding container c31f4ff6a04e143d77295c65dc99eb274e725f272e14d23a67557303725b2f48: Status 404 returned error can't find the container with id c31f4ff6a04e143d77295c65dc99eb274e725f272e14d23a67557303725b2f48
Oct 01 13:15:15 crc kubenswrapper[4851]: I1001 13:15:15.945093 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-q4cn9"
Oct 01 13:15:16 crc kubenswrapper[4851]: I1001 13:15:16.057601 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 01 13:15:16 crc kubenswrapper[4851]: I1001 13:15:16.070799 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c4f994dc9-lkgs9"]
Oct 01 13:15:16 crc kubenswrapper[4851]: I1001 13:15:16.082531 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 13:15:16 crc kubenswrapper[4851]: W1001 13:15:16.084025 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e85f88e_bad5_4b06_8767_b1720bacf2ea.slice/crio-8a60607b540237f872f52dcee09e602d3da695b95b52af858ceb98bac6f7272d WatchSource:0}: Error finding container 8a60607b540237f872f52dcee09e602d3da695b95b52af858ceb98bac6f7272d: Status 404 returned error can't find the container with id 8a60607b540237f872f52dcee09e602d3da695b95b52af858ceb98bac6f7272d
Oct 01 13:15:16 crc kubenswrapper[4851]: I1001 13:15:16.247027 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7shqh" event={"ID":"ed619e77-8ae0-4eaf-8c68-611b8883f603","Type":"ContainerStarted","Data":"6a05f3b09fa8ebef756d7566475da11afa25007ecf5b1addb8f181475dbd6894"}
Oct 01 13:15:16 crc kubenswrapper[4851]: I1001 13:15:16.247373 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7shqh" event={"ID":"ed619e77-8ae0-4eaf-8c68-611b8883f603","Type":"ContainerStarted","Data":"e9687405733e53e25d89dce15fec93c4a226014fd865ecffdbb050fae86edfce"}
Oct 01 13:15:16 crc kubenswrapper[4851]: I1001 13:15:16.248992 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4ce33484-970b-4326-b51a-899bfaee9727","Type":"ContainerStarted","Data":"a57364172ce8ba432037e9d09e79bc456f6aa94df1f3bd967e61d140702d8531"}
Oct 01 13:15:16 crc kubenswrapper[4851]: I1001 13:15:16.251041 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1e85f88e-bad5-4b06-8767-b1720bacf2ea","Type":"ContainerStarted","Data":"8a60607b540237f872f52dcee09e602d3da695b95b52af858ceb98bac6f7272d"}
Oct 01 13:15:16 crc kubenswrapper[4851]: I1001 13:15:16.253057 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1d57440b-c82a-4b73-82ae-f9a432089203","Type":"ContainerStarted","Data":"c31f4ff6a04e143d77295c65dc99eb274e725f272e14d23a67557303725b2f48"}
Oct 01 13:15:16 crc kubenswrapper[4851]: I1001 13:15:16.255871 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"16a3f1fc-2d6b-4d3a-95c3-b36258749d53","Type":"ContainerStarted","Data":"ef3825674fc0274179c024445cef7fe57746075815a0e6f5ec5ac603c63de0ee"}
Oct 01 13:15:16 crc kubenswrapper[4851]: I1001 13:15:16.258730 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" event={"ID":"6db851a0-f990-44cf-8766-ef6891037e3a","Type":"ContainerStarted","Data":"47e6ed57dbd8c6e3383c2752c6d06ecdc82460f3a1428a81add15357187e15fa"}
Oct 01 13:15:16 crc kubenswrapper[4851]: I1001 13:15:16.275793 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-7shqh" podStartSLOduration=2.275775841 podStartE2EDuration="2.275775841s" podCreationTimestamp="2025-10-01 13:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:15:16.271137705 +0000 UTC m=+1324.616255201" watchObservedRunningTime="2025-10-01 13:15:16.275775841 +0000 UTC m=+1324.620893327"
Oct 01 13:15:16 crc kubenswrapper[4851]: I1001 13:15:16.499850 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-q4cn9"]
Oct 01 13:15:17 crc kubenswrapper[4851]: I1001 13:15:17.275610 4851 generic.go:334] "Generic (PLEG): container finished" podID="6db851a0-f990-44cf-8766-ef6891037e3a" containerID="e7094989432d343cdf41b68eb5a16f64934aae57c5926f1777802cf768a8932b" exitCode=0
Oct 01 13:15:17 crc kubenswrapper[4851]: I1001 13:15:17.275697 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" event={"ID":"6db851a0-f990-44cf-8766-ef6891037e3a","Type":"ContainerDied","Data":"e7094989432d343cdf41b68eb5a16f64934aae57c5926f1777802cf768a8932b"}
Oct 01 13:15:18 crc kubenswrapper[4851]: I1001 13:15:18.058871 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 13:15:18 crc kubenswrapper[4851]: I1001 13:15:18.073591 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 01 13:15:18 crc kubenswrapper[4851]: W1001 13:15:18.301374 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a092c6d_ccb7_4c6f_b159_d7d6f6b53b7b.slice/crio-d8cb411493f6acb9ff39004fbcf3e39a1d996238b7a1c7e010c25d22bfbe03e0 WatchSource:0}: Error finding container d8cb411493f6acb9ff39004fbcf3e39a1d996238b7a1c7e010c25d22bfbe03e0: Status 404 returned error can't find the container with id d8cb411493f6acb9ff39004fbcf3e39a1d996238b7a1c7e010c25d22bfbe03e0
Oct 01 13:15:19 crc kubenswrapper[4851]: I1001 13:15:19.298814 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="16a3f1fc-2d6b-4d3a-95c3-b36258749d53" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://0bca02248dc1c251e8985412a6a8c29bc91aaed972b3ba3d5795a5fa1e8b1be5" gracePeriod=30
Oct 01 13:15:19 crc kubenswrapper[4851]: I1001 13:15:19.299261 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"16a3f1fc-2d6b-4d3a-95c3-b36258749d53","Type":"ContainerStarted","Data":"0bca02248dc1c251e8985412a6a8c29bc91aaed972b3ba3d5795a5fa1e8b1be5"}
Oct 01 13:15:19 crc kubenswrapper[4851]: I1001 13:15:19.311629 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" event={"ID":"6db851a0-f990-44cf-8766-ef6891037e3a","Type":"ContainerStarted","Data":"b037c68ef909e110d913e388097f52e92456ceaa1bfae19d7a32debb805a6fec"}
Oct 01 13:15:19 crc kubenswrapper[4851]: I1001 13:15:19.316320 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9"
Oct 01 13:15:19 crc kubenswrapper[4851]: I1001 13:15:19.319796 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.9249595080000002 podStartE2EDuration="5.319784707s" podCreationTimestamp="2025-10-01 13:15:14 +0000 UTC" firstStartedPulling="2025-10-01 13:15:15.459278268 +0000 UTC m=+1323.804395754" lastFinishedPulling="2025-10-01 13:15:18.854103477 +0000 UTC m=+1327.199220953" observedRunningTime="2025-10-01 13:15:19.31953672 +0000 UTC m=+1327.664654216" watchObservedRunningTime="2025-10-01 13:15:19.319784707 +0000 UTC m=+1327.664902193"
Oct 01 13:15:19 crc kubenswrapper[4851]: I1001 13:15:19.324273 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-q4cn9" event={"ID":"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b","Type":"ContainerStarted","Data":"37fdd75b45485c986f60c17b2c07beddfef2a71d34a715c47676888c61459a4f"}
Oct 01 13:15:19 crc kubenswrapper[4851]: I1001 13:15:19.324344 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-q4cn9" event={"ID":"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b","Type":"ContainerStarted","Data":"d8cb411493f6acb9ff39004fbcf3e39a1d996238b7a1c7e010c25d22bfbe03e0"}
Oct 01 13:15:19 crc kubenswrapper[4851]: I1001 13:15:19.333813 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4ce33484-970b-4326-b51a-899bfaee9727","Type":"ContainerStarted","Data":"06c979b668da1e00d28ccd717ae42bbea437ae4bd26390f5e584a4380d42fe20"}
Oct 01 13:15:19 crc kubenswrapper[4851]: I1001 13:15:19.341785 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1e85f88e-bad5-4b06-8767-b1720bacf2ea","Type":"ContainerStarted","Data":"29bdd0f6232c950c532fbfd0eeb68ef456e736441d843c146e37b969bc59871e"}
Oct 01 13:15:19 crc kubenswrapper[4851]: I1001 13:15:19.345121 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1d57440b-c82a-4b73-82ae-f9a432089203","Type":"ContainerStarted","Data":"5e810e5fad406c0d942c48802eb8d00d1aa33a47cf0d2d95e9086256e13ea5b6"}
Oct 01 13:15:19 crc kubenswrapper[4851]: I1001 13:15:19.349329 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" podStartSLOduration=5.349307361 podStartE2EDuration="5.349307361s" podCreationTimestamp="2025-10-01 13:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:15:19.341159529 +0000 UTC m=+1327.686277015" watchObservedRunningTime="2025-10-01 13:15:19.349307361 +0000 UTC m=+1327.694424847"
Oct 01 13:15:19 crc kubenswrapper[4851]: I1001 13:15:19.367356 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-q4cn9" podStartSLOduration=4.367335512 podStartE2EDuration="4.367335512s" podCreationTimestamp="2025-10-01 13:15:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:15:19.356922758 +0000 UTC m=+1327.702040254" watchObservedRunningTime="2025-10-01 13:15:19.367335512 +0000 UTC m=+1327.712452998"
Oct 01 13:15:19 crc kubenswrapper[4851]: I1001 13:15:19.382056 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.455196527 podStartE2EDuration="5.382037772s" podCreationTimestamp="2025-10-01 13:15:14 +0000 UTC" firstStartedPulling="2025-10-01 13:15:15.935167667 +0000 UTC m=+1324.280285153" lastFinishedPulling="2025-10-01 13:15:18.862008912 +0000 UTC m=+1327.207126398" observedRunningTime="2025-10-01 13:15:19.373477839 +0000 UTC m=+1327.718595345" watchObservedRunningTime="2025-10-01 13:15:19.382037772 +0000 UTC m=+1327.727155258"
Oct 01 13:15:19 crc kubenswrapper[4851]: I1001 13:15:19.765791 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 01 13:15:20 crc kubenswrapper[4851]: I1001 13:15:20.078762 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Oct 01 13:15:20 crc kubenswrapper[4851]: I1001 13:15:20.377992 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4ce33484-970b-4326-b51a-899bfaee9727","Type":"ContainerStarted","Data":"7ba5c0c8fb0fc46e0498ab47657e00c7647baacb0fd236555bc50d5564816bf8"}
Oct 01 13:15:20 crc kubenswrapper[4851]: I1001 13:15:20.378045 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4ce33484-970b-4326-b51a-899bfaee9727" containerName="nova-metadata-log" containerID="cri-o://06c979b668da1e00d28ccd717ae42bbea437ae4bd26390f5e584a4380d42fe20" gracePeriod=30
Oct 01 13:15:20 crc kubenswrapper[4851]: I1001 13:15:20.379186 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4ce33484-970b-4326-b51a-899bfaee9727" containerName="nova-metadata-metadata" containerID="cri-o://7ba5c0c8fb0fc46e0498ab47657e00c7647baacb0fd236555bc50d5564816bf8" gracePeriod=30
Oct 01 13:15:20 crc kubenswrapper[4851]: I1001 13:15:20.386061 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1e85f88e-bad5-4b06-8767-b1720bacf2ea","Type":"ContainerStarted","Data":"8caa28e34abe628c54bb4e1a0c9785f56476f4c1eb2b783560c3917b4ffc54cc"}
Oct 01 13:15:20 crc kubenswrapper[4851]: I1001 13:15:20.436469 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.64373185 podStartE2EDuration="6.436438923s" podCreationTimestamp="2025-10-01 13:15:14 +0000 UTC" firstStartedPulling="2025-10-01 13:15:16.13726648 +0000 UTC m=+1324.482383966" lastFinishedPulling="2025-10-01 13:15:18.929973553 +0000 UTC m=+1327.275091039" observedRunningTime="2025-10-01 13:15:20.419232205 +0000 UTC m=+1328.764349751" watchObservedRunningTime="2025-10-01 13:15:20.436438923 +0000 UTC m=+1328.781556449"
Oct 01 13:15:20 crc kubenswrapper[4851]: I1001 13:15:20.463113 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.691286495 podStartE2EDuration="6.463086949s" podCreationTimestamp="2025-10-01 13:15:14 +0000 UTC" firstStartedPulling="2025-10-01 13:15:16.09027844 +0000 UTC m=+1324.435395926" lastFinishedPulling="2025-10-01 13:15:18.862078894 +0000 UTC m=+1327.207196380" observedRunningTime="2025-10-01 13:15:20.443104515 +0000 UTC m=+1328.788222011" watchObservedRunningTime="2025-10-01 13:15:20.463086949 +0000 UTC m=+1328.808204475"
Oct 01 13:15:21 crc kubenswrapper[4851]: I1001 13:15:21.402117 4851 generic.go:334] "Generic (PLEG): container finished" podID="4ce33484-970b-4326-b51a-899bfaee9727" containerID="7ba5c0c8fb0fc46e0498ab47657e00c7647baacb0fd236555bc50d5564816bf8" exitCode=0
Oct 01 13:15:21 crc kubenswrapper[4851]: I1001 13:15:21.402464 4851 generic.go:334] "Generic (PLEG): container finished" podID="4ce33484-970b-4326-b51a-899bfaee9727" containerID="06c979b668da1e00d28ccd717ae42bbea437ae4bd26390f5e584a4380d42fe20" exitCode=143
Oct 01 13:15:21 crc kubenswrapper[4851]: I1001 13:15:21.402187 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4ce33484-970b-4326-b51a-899bfaee9727","Type":"ContainerDied","Data":"7ba5c0c8fb0fc46e0498ab47657e00c7647baacb0fd236555bc50d5564816bf8"}
Oct 01 13:15:21 crc kubenswrapper[4851]: I1001 13:15:21.403690 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4ce33484-970b-4326-b51a-899bfaee9727","Type":"ContainerDied","Data":"06c979b668da1e00d28ccd717ae42bbea437ae4bd26390f5e584a4380d42fe20"}
Oct 01 13:15:21 crc kubenswrapper[4851]: I1001 13:15:21.564167 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 01 13:15:21 crc kubenswrapper[4851]: I1001 13:15:21.682747 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce33484-970b-4326-b51a-899bfaee9727-logs\") pod \"4ce33484-970b-4326-b51a-899bfaee9727\" (UID: \"4ce33484-970b-4326-b51a-899bfaee9727\") "
Oct 01 13:15:21 crc kubenswrapper[4851]: I1001 13:15:21.683300 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ce33484-970b-4326-b51a-899bfaee9727-logs" (OuterVolumeSpecName: "logs") pod "4ce33484-970b-4326-b51a-899bfaee9727" (UID: "4ce33484-970b-4326-b51a-899bfaee9727"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:15:21 crc kubenswrapper[4851]: I1001 13:15:21.683365 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce33484-970b-4326-b51a-899bfaee9727-config-data\") pod \"4ce33484-970b-4326-b51a-899bfaee9727\" (UID: \"4ce33484-970b-4326-b51a-899bfaee9727\") "
Oct 01 13:15:21 crc kubenswrapper[4851]: I1001 13:15:21.684070 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce33484-970b-4326-b51a-899bfaee9727-combined-ca-bundle\") pod \"4ce33484-970b-4326-b51a-899bfaee9727\" (UID: \"4ce33484-970b-4326-b51a-899bfaee9727\") "
Oct 01 13:15:21 crc kubenswrapper[4851]: I1001 13:15:21.684150 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbkcn\" (UniqueName: \"kubernetes.io/projected/4ce33484-970b-4326-b51a-899bfaee9727-kube-api-access-cbkcn\") pod \"4ce33484-970b-4326-b51a-899bfaee9727\" (UID: \"4ce33484-970b-4326-b51a-899bfaee9727\") "
Oct 01 13:15:21 crc kubenswrapper[4851]: I1001 13:15:21.685231 4851 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce33484-970b-4326-b51a-899bfaee9727-logs\") on node \"crc\" DevicePath \"\""
Oct 01 13:15:21 crc kubenswrapper[4851]: I1001 13:15:21.711482 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ce33484-970b-4326-b51a-899bfaee9727-kube-api-access-cbkcn" (OuterVolumeSpecName: "kube-api-access-cbkcn") pod "4ce33484-970b-4326-b51a-899bfaee9727" (UID: "4ce33484-970b-4326-b51a-899bfaee9727"). InnerVolumeSpecName "kube-api-access-cbkcn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:15:21 crc kubenswrapper[4851]: I1001 13:15:21.718986 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce33484-970b-4326-b51a-899bfaee9727-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ce33484-970b-4326-b51a-899bfaee9727" (UID: "4ce33484-970b-4326-b51a-899bfaee9727"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:15:21 crc kubenswrapper[4851]: I1001 13:15:21.720395 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce33484-970b-4326-b51a-899bfaee9727-config-data" (OuterVolumeSpecName: "config-data") pod "4ce33484-970b-4326-b51a-899bfaee9727" (UID: "4ce33484-970b-4326-b51a-899bfaee9727"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:15:21 crc kubenswrapper[4851]: I1001 13:15:21.786538 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce33484-970b-4326-b51a-899bfaee9727-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 13:15:21 crc kubenswrapper[4851]: I1001 13:15:21.786658 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbkcn\" (UniqueName: \"kubernetes.io/projected/4ce33484-970b-4326-b51a-899bfaee9727-kube-api-access-cbkcn\") on node \"crc\" DevicePath \"\""
Oct 01 13:15:21 crc kubenswrapper[4851]: I1001 13:15:21.786725 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce33484-970b-4326-b51a-899bfaee9727-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.416587 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4ce33484-970b-4326-b51a-899bfaee9727","Type":"ContainerDied","Data":"a57364172ce8ba432037e9d09e79bc456f6aa94df1f3bd967e61d140702d8531"}
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.416644 4851 scope.go:117] "RemoveContainer" containerID="7ba5c0c8fb0fc46e0498ab47657e00c7647baacb0fd236555bc50d5564816bf8"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.416641 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.464108 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.471063 4851 scope.go:117] "RemoveContainer" containerID="06c979b668da1e00d28ccd717ae42bbea437ae4bd26390f5e584a4380d42fe20"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.477730 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.501223 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 13:15:22 crc kubenswrapper[4851]: E1001 13:15:22.501932 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce33484-970b-4326-b51a-899bfaee9727" containerName="nova-metadata-log"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.501960 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce33484-970b-4326-b51a-899bfaee9727" containerName="nova-metadata-log"
Oct 01 13:15:22 crc kubenswrapper[4851]: E1001 13:15:22.502025 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce33484-970b-4326-b51a-899bfaee9727" containerName="nova-metadata-metadata"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.502038 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce33484-970b-4326-b51a-899bfaee9727" containerName="nova-metadata-metadata"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.502370 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ce33484-970b-4326-b51a-899bfaee9727" containerName="nova-metadata-metadata"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.502404 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ce33484-970b-4326-b51a-899bfaee9727" containerName="nova-metadata-log"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.504361 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.508216 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.508404 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.517695 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.602384 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cea8c682-c707-4a3c-a543-87fd2b68c9ef-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.602455 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea8c682-c707-4a3c-a543-87fd2b68c9ef-logs\") pod \"nova-metadata-0\" (UID: \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.602607 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea8c682-c707-4a3c-a543-87fd2b68c9ef-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.602636 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea8c682-c707-4a3c-a543-87fd2b68c9ef-config-data\") pod \"nova-metadata-0\" (UID: \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.602687 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkdwj\" (UniqueName: \"kubernetes.io/projected/cea8c682-c707-4a3c-a543-87fd2b68c9ef-kube-api-access-kkdwj\") pod \"nova-metadata-0\" (UID: \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.704709 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cea8c682-c707-4a3c-a543-87fd2b68c9ef-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.704785 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea8c682-c707-4a3c-a543-87fd2b68c9ef-logs\") pod \"nova-metadata-0\" (UID: \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.704876 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea8c682-c707-4a3c-a543-87fd2b68c9ef-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.704911 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea8c682-c707-4a3c-a543-87fd2b68c9ef-config-data\") pod \"nova-metadata-0\" (UID: \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.704983 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkdwj\" (UniqueName: \"kubernetes.io/projected/cea8c682-c707-4a3c-a543-87fd2b68c9ef-kube-api-access-kkdwj\") pod \"nova-metadata-0\" (UID: \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.705473 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea8c682-c707-4a3c-a543-87fd2b68c9ef-logs\") pod \"nova-metadata-0\" (UID: \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.708917 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cea8c682-c707-4a3c-a543-87fd2b68c9ef-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.709104 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea8c682-c707-4a3c-a543-87fd2b68c9ef-config-data\") pod \"nova-metadata-0\" (UID: \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.711714 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea8c682-c707-4a3c-a543-87fd2b68c9ef-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.735967 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkdwj\" (UniqueName: \"kubernetes.io/projected/cea8c682-c707-4a3c-a543-87fd2b68c9ef-kube-api-access-kkdwj\") pod \"nova-metadata-0\" (UID: \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:22 crc kubenswrapper[4851]: I1001 13:15:22.827937 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 01 13:15:23 crc kubenswrapper[4851]: I1001 13:15:23.369277 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 13:15:23 crc kubenswrapper[4851]: W1001 13:15:23.388672 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcea8c682_c707_4a3c_a543_87fd2b68c9ef.slice/crio-6bf8d812bb3bccef75d59caea8afc20931654b1a057fa6af70da94287787fa2a WatchSource:0}: Error finding container 6bf8d812bb3bccef75d59caea8afc20931654b1a057fa6af70da94287787fa2a: Status 404 returned error can't find the container with id 6bf8d812bb3bccef75d59caea8afc20931654b1a057fa6af70da94287787fa2a
Oct 01 13:15:23 crc kubenswrapper[4851]: I1001 13:15:23.431644 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cea8c682-c707-4a3c-a543-87fd2b68c9ef","Type":"ContainerStarted","Data":"6bf8d812bb3bccef75d59caea8afc20931654b1a057fa6af70da94287787fa2a"}
Oct 01 13:15:24 crc kubenswrapper[4851]: I1001 13:15:24.351325 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ce33484-970b-4326-b51a-899bfaee9727" path="/var/lib/kubelet/pods/4ce33484-970b-4326-b51a-899bfaee9727/volumes"
Oct 01 13:15:24 crc kubenswrapper[4851]: I1001 13:15:24.449360 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cea8c682-c707-4a3c-a543-87fd2b68c9ef","Type":"ContainerStarted","Data":"254ab61bda91f7b245359ae3be65fe94dc63b2a4581fadde28c5fa5c78f03b1a"}
Oct 01 13:15:24 crc kubenswrapper[4851]: I1001 13:15:24.449424 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cea8c682-c707-4a3c-a543-87fd2b68c9ef","Type":"ContainerStarted","Data":"220ff786f18813df6bfaebf315f3e89efb7446a47342d477f79eae111347f3e7"}
Oct 01 13:15:24 crc kubenswrapper[4851]: I1001 13:15:24.451560 4851 generic.go:334] "Generic (PLEG): container finished" podID="ed619e77-8ae0-4eaf-8c68-611b8883f603" containerID="6a05f3b09fa8ebef756d7566475da11afa25007ecf5b1addb8f181475dbd6894" exitCode=0
Oct 01 13:15:24 crc kubenswrapper[4851]: I1001 13:15:24.451593 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7shqh" event={"ID":"ed619e77-8ae0-4eaf-8c68-611b8883f603","Type":"ContainerDied","Data":"6a05f3b09fa8ebef756d7566475da11afa25007ecf5b1addb8f181475dbd6894"}
Oct 01 13:15:24 crc kubenswrapper[4851]: I1001 13:15:24.479879 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.4798599230000002 podStartE2EDuration="2.479859923s" podCreationTimestamp="2025-10-01 13:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:15:24.46984604 +0000 UTC m=+1332.814963566" watchObservedRunningTime="2025-10-01 13:15:24.479859923 +0000 UTC m=+1332.824977419"
Oct 01 13:15:25 crc kubenswrapper[4851]: I1001 13:15:25.079216 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Oct 01 13:15:25 crc kubenswrapper[4851]: I1001 13:15:25.147192 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Oct 01 13:15:25 crc kubenswrapper[4851]: I1001 13:15:25.178778 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9"
Oct 01 13:15:25 crc kubenswrapper[4851]: I1001 13:15:25.201389 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 01 13:15:25 crc kubenswrapper[4851]: I1001 13:15:25.201457 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 01 13:15:25 crc kubenswrapper[4851]: I1001 13:15:25.264848 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-788dbb8495-dp5gr"]
Oct 01 13:15:25 crc kubenswrapper[4851]: I1001 13:15:25.265154 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" podUID="a82d9b2c-d132-4496-92f6-9fea17968b18" containerName="dnsmasq-dns" containerID="cri-o://4fa2acb1d0369e23bb7b9f896175d9a71261580a50338f3228baddf6bc79fee1" gracePeriod=10
Oct 01 13:15:25 crc kubenswrapper[4851]: I1001 13:15:25.479761 4851 generic.go:334] "Generic (PLEG): container finished" podID="a82d9b2c-d132-4496-92f6-9fea17968b18" containerID="4fa2acb1d0369e23bb7b9f896175d9a71261580a50338f3228baddf6bc79fee1" exitCode=0
Oct 01 13:15:25 crc kubenswrapper[4851]: I1001 13:15:25.479852 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" event={"ID":"a82d9b2c-d132-4496-92f6-9fea17968b18","Type":"ContainerDied","Data":"4fa2acb1d0369e23bb7b9f896175d9a71261580a50338f3228baddf6bc79fee1"}
Oct 01 13:15:25 crc kubenswrapper[4851]: I1001 13:15:25.529382 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 01 13:15:25 crc kubenswrapper[4851]: I1001 13:15:25.938223 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-788dbb8495-dp5gr"
Oct 01 13:15:25 crc kubenswrapper[4851]: I1001 13:15:25.942405 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7shqh"
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.103117 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed619e77-8ae0-4eaf-8c68-611b8883f603-config-data\") pod \"ed619e77-8ae0-4eaf-8c68-611b8883f603\" (UID: \"ed619e77-8ae0-4eaf-8c68-611b8883f603\") "
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.103432 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-ovsdbserver-sb\") pod \"a82d9b2c-d132-4496-92f6-9fea17968b18\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") "
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.103459 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxh7v\" (UniqueName: \"kubernetes.io/projected/a82d9b2c-d132-4496-92f6-9fea17968b18-kube-api-access-mxh7v\") pod \"a82d9b2c-d132-4496-92f6-9fea17968b18\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") "
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.103533 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjmfk\" (UniqueName: \"kubernetes.io/projected/ed619e77-8ae0-4eaf-8c68-611b8883f603-kube-api-access-cjmfk\") pod \"ed619e77-8ae0-4eaf-8c68-611b8883f603\" (UID: \"ed619e77-8ae0-4eaf-8c68-611b8883f603\") "
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.103562 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-dns-svc\") pod \"a82d9b2c-d132-4496-92f6-9fea17968b18\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") "
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.103600 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed619e77-8ae0-4eaf-8c68-611b8883f603-scripts\") pod \"ed619e77-8ae0-4eaf-8c68-611b8883f603\" (UID: \"ed619e77-8ae0-4eaf-8c68-611b8883f603\") "
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.103625 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-config\") pod \"a82d9b2c-d132-4496-92f6-9fea17968b18\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") "
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.103644 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed619e77-8ae0-4eaf-8c68-611b8883f603-combined-ca-bundle\") pod \"ed619e77-8ae0-4eaf-8c68-611b8883f603\" (UID: \"ed619e77-8ae0-4eaf-8c68-611b8883f603\") "
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.103691 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-ovsdbserver-nb\") pod \"a82d9b2c-d132-4496-92f6-9fea17968b18\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") "
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.103721 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-dns-swift-storage-0\") pod \"a82d9b2c-d132-4496-92f6-9fea17968b18\" (UID: \"a82d9b2c-d132-4496-92f6-9fea17968b18\") "
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.110044 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a82d9b2c-d132-4496-92f6-9fea17968b18-kube-api-access-mxh7v" (OuterVolumeSpecName: "kube-api-access-mxh7v") pod "a82d9b2c-d132-4496-92f6-9fea17968b18" (UID: "a82d9b2c-d132-4496-92f6-9fea17968b18"). InnerVolumeSpecName "kube-api-access-mxh7v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.123598 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed619e77-8ae0-4eaf-8c68-611b8883f603-kube-api-access-cjmfk" (OuterVolumeSpecName: "kube-api-access-cjmfk") pod "ed619e77-8ae0-4eaf-8c68-611b8883f603" (UID: "ed619e77-8ae0-4eaf-8c68-611b8883f603"). InnerVolumeSpecName "kube-api-access-cjmfk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.124062 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed619e77-8ae0-4eaf-8c68-611b8883f603-scripts" (OuterVolumeSpecName: "scripts") pod "ed619e77-8ae0-4eaf-8c68-611b8883f603" (UID: "ed619e77-8ae0-4eaf-8c68-611b8883f603"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.140792 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed619e77-8ae0-4eaf-8c68-611b8883f603-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed619e77-8ae0-4eaf-8c68-611b8883f603" (UID: "ed619e77-8ae0-4eaf-8c68-611b8883f603"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.156868 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a82d9b2c-d132-4496-92f6-9fea17968b18" (UID: "a82d9b2c-d132-4496-92f6-9fea17968b18"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.160272 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a82d9b2c-d132-4496-92f6-9fea17968b18" (UID: "a82d9b2c-d132-4496-92f6-9fea17968b18"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.169540 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a82d9b2c-d132-4496-92f6-9fea17968b18" (UID: "a82d9b2c-d132-4496-92f6-9fea17968b18"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.173158 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed619e77-8ae0-4eaf-8c68-611b8883f603-config-data" (OuterVolumeSpecName: "config-data") pod "ed619e77-8ae0-4eaf-8c68-611b8883f603" (UID: "ed619e77-8ae0-4eaf-8c68-611b8883f603"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.175630 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a82d9b2c-d132-4496-92f6-9fea17968b18" (UID: "a82d9b2c-d132-4496-92f6-9fea17968b18"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.186148 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-config" (OuterVolumeSpecName: "config") pod "a82d9b2c-d132-4496-92f6-9fea17968b18" (UID: "a82d9b2c-d132-4496-92f6-9fea17968b18"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.208049 4851 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.208076 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed619e77-8ae0-4eaf-8c68-611b8883f603-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.208086 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-config\") on node \"crc\" DevicePath \"\""
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.208095 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed619e77-8ae0-4eaf-8c68-611b8883f603-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.208106 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.208113 4851 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.208122 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed619e77-8ae0-4eaf-8c68-611b8883f603-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.208133 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a82d9b2c-d132-4496-92f6-9fea17968b18-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.208142 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxh7v\" (UniqueName: \"kubernetes.io/projected/a82d9b2c-d132-4496-92f6-9fea17968b18-kube-api-access-mxh7v\") on node \"crc\" DevicePath \"\""
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.208152 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjmfk\" (UniqueName: \"kubernetes.io/projected/ed619e77-8ae0-4eaf-8c68-611b8883f603-kube-api-access-cjmfk\") on node \"crc\" DevicePath \"\""
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.283726 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1e85f88e-bad5-4b06-8767-b1720bacf2ea" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.208:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.283789 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1e85f88e-bad5-4b06-8767-b1720bacf2ea" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.208:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.384439 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.494430 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7shqh" event={"ID":"ed619e77-8ae0-4eaf-8c68-611b8883f603","Type":"ContainerDied","Data":"e9687405733e53e25d89dce15fec93c4a226014fd865ecffdbb050fae86edfce"}
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.494520 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9687405733e53e25d89dce15fec93c4a226014fd865ecffdbb050fae86edfce"
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.494624 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7shqh"
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.497539 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-788dbb8495-dp5gr" event={"ID":"a82d9b2c-d132-4496-92f6-9fea17968b18","Type":"ContainerDied","Data":"0ae9514cd0da7cac0dd81906df8159dbdedefee33a0d326288625f207e3110e1"}
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.497596 4851 scope.go:117] "RemoveContainer" containerID="4fa2acb1d0369e23bb7b9f896175d9a71261580a50338f3228baddf6bc79fee1"
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.497556 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-788dbb8495-dp5gr"
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.523066 4851 scope.go:117] "RemoveContainer" containerID="6174c46436f76d97618e19239c65323c1c2e5203410f72bb5ba7d3d89c2d5413"
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.555258 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-788dbb8495-dp5gr"]
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.562004 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-788dbb8495-dp5gr"]
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.627531 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.627744 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1e85f88e-bad5-4b06-8767-b1720bacf2ea" containerName="nova-api-log" containerID="cri-o://29bdd0f6232c950c532fbfd0eeb68ef456e736441d843c146e37b969bc59871e" gracePeriod=30
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.627867 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1e85f88e-bad5-4b06-8767-b1720bacf2ea" containerName="nova-api-api" containerID="cri-o://8caa28e34abe628c54bb4e1a0c9785f56476f4c1eb2b783560c3917b4ffc54cc" gracePeriod=30
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.653203 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.653489 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cea8c682-c707-4a3c-a543-87fd2b68c9ef" containerName="nova-metadata-log" containerID="cri-o://220ff786f18813df6bfaebf315f3e89efb7446a47342d477f79eae111347f3e7" gracePeriod=30
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.653578 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cea8c682-c707-4a3c-a543-87fd2b68c9ef" containerName="nova-metadata-metadata" containerID="cri-o://254ab61bda91f7b245359ae3be65fe94dc63b2a4581fadde28c5fa5c78f03b1a" gracePeriod=30
Oct 01 13:15:26 crc kubenswrapper[4851]: I1001 13:15:26.661827 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.172376 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.328716 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkdwj\" (UniqueName: \"kubernetes.io/projected/cea8c682-c707-4a3c-a543-87fd2b68c9ef-kube-api-access-kkdwj\") pod \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\" (UID: \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\") "
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.328773 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea8c682-c707-4a3c-a543-87fd2b68c9ef-logs\") pod \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\" (UID: \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\") "
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.328838 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cea8c682-c707-4a3c-a543-87fd2b68c9ef-nova-metadata-tls-certs\") pod \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\" (UID: \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\") "
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.328878 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea8c682-c707-4a3c-a543-87fd2b68c9ef-config-data\") pod \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\" (UID: \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\") "
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.329269 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea8c682-c707-4a3c-a543-87fd2b68c9ef-logs" (OuterVolumeSpecName: "logs") pod "cea8c682-c707-4a3c-a543-87fd2b68c9ef" (UID: "cea8c682-c707-4a3c-a543-87fd2b68c9ef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.329662 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea8c682-c707-4a3c-a543-87fd2b68c9ef-combined-ca-bundle\") pod \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\" (UID: \"cea8c682-c707-4a3c-a543-87fd2b68c9ef\") "
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.330274 4851 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea8c682-c707-4a3c-a543-87fd2b68c9ef-logs\") on node \"crc\" DevicePath \"\""
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.333723 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cea8c682-c707-4a3c-a543-87fd2b68c9ef-kube-api-access-kkdwj" (OuterVolumeSpecName: "kube-api-access-kkdwj") pod "cea8c682-c707-4a3c-a543-87fd2b68c9ef" (UID: "cea8c682-c707-4a3c-a543-87fd2b68c9ef"). InnerVolumeSpecName "kube-api-access-kkdwj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.361216 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea8c682-c707-4a3c-a543-87fd2b68c9ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cea8c682-c707-4a3c-a543-87fd2b68c9ef" (UID: "cea8c682-c707-4a3c-a543-87fd2b68c9ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.366911 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea8c682-c707-4a3c-a543-87fd2b68c9ef-config-data" (OuterVolumeSpecName: "config-data") pod "cea8c682-c707-4a3c-a543-87fd2b68c9ef" (UID: "cea8c682-c707-4a3c-a543-87fd2b68c9ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.403419 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea8c682-c707-4a3c-a543-87fd2b68c9ef-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "cea8c682-c707-4a3c-a543-87fd2b68c9ef" (UID: "cea8c682-c707-4a3c-a543-87fd2b68c9ef"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.431896 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea8c682-c707-4a3c-a543-87fd2b68c9ef-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.431940 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea8c682-c707-4a3c-a543-87fd2b68c9ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.431960 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkdwj\" (UniqueName: \"kubernetes.io/projected/cea8c682-c707-4a3c-a543-87fd2b68c9ef-kube-api-access-kkdwj\") on node \"crc\" DevicePath \"\""
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.431980 4851 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cea8c682-c707-4a3c-a543-87fd2b68c9ef-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.513543 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1e85f88e-bad5-4b06-8767-b1720bacf2ea","Type":"ContainerDied","Data":"29bdd0f6232c950c532fbfd0eeb68ef456e736441d843c146e37b969bc59871e"}
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.513493 4851 generic.go:334] "Generic (PLEG): container finished" podID="1e85f88e-bad5-4b06-8767-b1720bacf2ea" containerID="29bdd0f6232c950c532fbfd0eeb68ef456e736441d843c146e37b969bc59871e" exitCode=143
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.515464 4851 generic.go:334] "Generic (PLEG): container finished" podID="cea8c682-c707-4a3c-a543-87fd2b68c9ef" containerID="254ab61bda91f7b245359ae3be65fe94dc63b2a4581fadde28c5fa5c78f03b1a" exitCode=0
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.515489 4851 generic.go:334] "Generic (PLEG): container finished" podID="cea8c682-c707-4a3c-a543-87fd2b68c9ef" containerID="220ff786f18813df6bfaebf315f3e89efb7446a47342d477f79eae111347f3e7" exitCode=143
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.515641 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1d57440b-c82a-4b73-82ae-f9a432089203" containerName="nova-scheduler-scheduler" containerID="cri-o://5e810e5fad406c0d942c48802eb8d00d1aa33a47cf0d2d95e9086256e13ea5b6" gracePeriod=30
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.515925 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.518618 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cea8c682-c707-4a3c-a543-87fd2b68c9ef","Type":"ContainerDied","Data":"254ab61bda91f7b245359ae3be65fe94dc63b2a4581fadde28c5fa5c78f03b1a"}
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.518647 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cea8c682-c707-4a3c-a543-87fd2b68c9ef","Type":"ContainerDied","Data":"220ff786f18813df6bfaebf315f3e89efb7446a47342d477f79eae111347f3e7"}
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.518658 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cea8c682-c707-4a3c-a543-87fd2b68c9ef","Type":"ContainerDied","Data":"6bf8d812bb3bccef75d59caea8afc20931654b1a057fa6af70da94287787fa2a"}
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.518673 4851 scope.go:117] "RemoveContainer" containerID="254ab61bda91f7b245359ae3be65fe94dc63b2a4581fadde28c5fa5c78f03b1a"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.551165 4851 scope.go:117] "RemoveContainer" containerID="220ff786f18813df6bfaebf315f3e89efb7446a47342d477f79eae111347f3e7"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.557301 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.580998 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.584934 4851 scope.go:117] "RemoveContainer" containerID="254ab61bda91f7b245359ae3be65fe94dc63b2a4581fadde28c5fa5c78f03b1a"
Oct 01 13:15:27 crc kubenswrapper[4851]: E1001 13:15:27.587263 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"254ab61bda91f7b245359ae3be65fe94dc63b2a4581fadde28c5fa5c78f03b1a\": container with ID starting with 254ab61bda91f7b245359ae3be65fe94dc63b2a4581fadde28c5fa5c78f03b1a not found: ID does not exist" containerID="254ab61bda91f7b245359ae3be65fe94dc63b2a4581fadde28c5fa5c78f03b1a"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.587292 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"254ab61bda91f7b245359ae3be65fe94dc63b2a4581fadde28c5fa5c78f03b1a"} err="failed to get container status \"254ab61bda91f7b245359ae3be65fe94dc63b2a4581fadde28c5fa5c78f03b1a\": rpc error: code = NotFound desc = could not find container \"254ab61bda91f7b245359ae3be65fe94dc63b2a4581fadde28c5fa5c78f03b1a\": container with ID starting with 254ab61bda91f7b245359ae3be65fe94dc63b2a4581fadde28c5fa5c78f03b1a not found: ID does not exist"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.587314 4851 scope.go:117] "RemoveContainer" containerID="220ff786f18813df6bfaebf315f3e89efb7446a47342d477f79eae111347f3e7"
Oct 01 13:15:27 crc kubenswrapper[4851]: E1001 13:15:27.589414 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"220ff786f18813df6bfaebf315f3e89efb7446a47342d477f79eae111347f3e7\": container with ID starting with 220ff786f18813df6bfaebf315f3e89efb7446a47342d477f79eae111347f3e7 not found: ID does not exist" containerID="220ff786f18813df6bfaebf315f3e89efb7446a47342d477f79eae111347f3e7"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.589439 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"220ff786f18813df6bfaebf315f3e89efb7446a47342d477f79eae111347f3e7"} err="failed to get container status \"220ff786f18813df6bfaebf315f3e89efb7446a47342d477f79eae111347f3e7\": rpc error: code = NotFound desc = could not find container \"220ff786f18813df6bfaebf315f3e89efb7446a47342d477f79eae111347f3e7\": container with ID starting with 220ff786f18813df6bfaebf315f3e89efb7446a47342d477f79eae111347f3e7 not found: ID does not exist"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.589453 4851 scope.go:117] "RemoveContainer" containerID="254ab61bda91f7b245359ae3be65fe94dc63b2a4581fadde28c5fa5c78f03b1a"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.590772 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"254ab61bda91f7b245359ae3be65fe94dc63b2a4581fadde28c5fa5c78f03b1a"} err="failed to get container status \"254ab61bda91f7b245359ae3be65fe94dc63b2a4581fadde28c5fa5c78f03b1a\": rpc error: code = NotFound desc = could not find container \"254ab61bda91f7b245359ae3be65fe94dc63b2a4581fadde28c5fa5c78f03b1a\": container with ID starting with 254ab61bda91f7b245359ae3be65fe94dc63b2a4581fadde28c5fa5c78f03b1a not found: ID does not exist"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.590832 4851 scope.go:117] "RemoveContainer" containerID="220ff786f18813df6bfaebf315f3e89efb7446a47342d477f79eae111347f3e7"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.591396 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"220ff786f18813df6bfaebf315f3e89efb7446a47342d477f79eae111347f3e7"} err="failed to get container status \"220ff786f18813df6bfaebf315f3e89efb7446a47342d477f79eae111347f3e7\": rpc error: code = NotFound desc = could not find container \"220ff786f18813df6bfaebf315f3e89efb7446a47342d477f79eae111347f3e7\": container with ID starting with 220ff786f18813df6bfaebf315f3e89efb7446a47342d477f79eae111347f3e7 not found: ID does not exist"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.592486 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 13:15:27 crc kubenswrapper[4851]: E1001 13:15:27.593128 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a82d9b2c-d132-4496-92f6-9fea17968b18" containerName="dnsmasq-dns"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.593147 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="a82d9b2c-d132-4496-92f6-9fea17968b18" containerName="dnsmasq-dns"
Oct 01 13:15:27 crc kubenswrapper[4851]: E1001 13:15:27.593164 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed619e77-8ae0-4eaf-8c68-611b8883f603" containerName="nova-manage"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.593170 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed619e77-8ae0-4eaf-8c68-611b8883f603" containerName="nova-manage"
Oct 01 13:15:27 crc kubenswrapper[4851]: E1001 13:15:27.593191 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a82d9b2c-d132-4496-92f6-9fea17968b18" containerName="init"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.593196 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="a82d9b2c-d132-4496-92f6-9fea17968b18" containerName="init"
Oct 01 13:15:27 crc kubenswrapper[4851]: E1001 13:15:27.593204 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea8c682-c707-4a3c-a543-87fd2b68c9ef" containerName="nova-metadata-metadata"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.593210 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea8c682-c707-4a3c-a543-87fd2b68c9ef" containerName="nova-metadata-metadata"
Oct 01 13:15:27 crc kubenswrapper[4851]: E1001 13:15:27.593239 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea8c682-c707-4a3c-a543-87fd2b68c9ef" containerName="nova-metadata-log"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.593244 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea8c682-c707-4a3c-a543-87fd2b68c9ef" containerName="nova-metadata-log"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.593424 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea8c682-c707-4a3c-a543-87fd2b68c9ef" containerName="nova-metadata-metadata"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.593447 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="a82d9b2c-d132-4496-92f6-9fea17968b18" containerName="dnsmasq-dns"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.593461 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea8c682-c707-4a3c-a543-87fd2b68c9ef" containerName="nova-metadata-log"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.593473 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed619e77-8ae0-4eaf-8c68-611b8883f603" containerName="nova-manage"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.594438 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.597362 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.597650 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.605172 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.738171 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68393f76-bfd1-40fc-b0e7-01bd567a657e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"68393f76-bfd1-40fc-b0e7-01bd567a657e\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.738247 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68393f76-bfd1-40fc-b0e7-01bd567a657e-logs\") pod \"nova-metadata-0\" (UID: \"68393f76-bfd1-40fc-b0e7-01bd567a657e\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.738282 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86l8l\" (UniqueName: \"kubernetes.io/projected/68393f76-bfd1-40fc-b0e7-01bd567a657e-kube-api-access-86l8l\") pod \"nova-metadata-0\" (UID: \"68393f76-bfd1-40fc-b0e7-01bd567a657e\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.738337 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/68393f76-bfd1-40fc-b0e7-01bd567a657e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"68393f76-bfd1-40fc-b0e7-01bd567a657e\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.738353 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68393f76-bfd1-40fc-b0e7-01bd567a657e-config-data\") pod \"nova-metadata-0\" (UID: \"68393f76-bfd1-40fc-b0e7-01bd567a657e\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.840428 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68393f76-bfd1-40fc-b0e7-01bd567a657e-logs\") pod \"nova-metadata-0\" (UID: \"68393f76-bfd1-40fc-b0e7-01bd567a657e\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.840538 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86l8l\" (UniqueName: \"kubernetes.io/projected/68393f76-bfd1-40fc-b0e7-01bd567a657e-kube-api-access-86l8l\") pod \"nova-metadata-0\" (UID: \"68393f76-bfd1-40fc-b0e7-01bd567a657e\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.840619 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/68393f76-bfd1-40fc-b0e7-01bd567a657e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"68393f76-bfd1-40fc-b0e7-01bd567a657e\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.840645 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68393f76-bfd1-40fc-b0e7-01bd567a657e-config-data\") pod \"nova-metadata-0\" (UID: \"68393f76-bfd1-40fc-b0e7-01bd567a657e\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.840780 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68393f76-bfd1-40fc-b0e7-01bd567a657e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"68393f76-bfd1-40fc-b0e7-01bd567a657e\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.840957 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68393f76-bfd1-40fc-b0e7-01bd567a657e-logs\") pod \"nova-metadata-0\" (UID: \"68393f76-bfd1-40fc-b0e7-01bd567a657e\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.845043 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68393f76-bfd1-40fc-b0e7-01bd567a657e-config-data\") pod \"nova-metadata-0\" (UID: \"68393f76-bfd1-40fc-b0e7-01bd567a657e\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.845312 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68393f76-bfd1-40fc-b0e7-01bd567a657e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"68393f76-bfd1-40fc-b0e7-01bd567a657e\") " pod="openstack/nova-metadata-0"
Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.845306 4851 operation_generator.go:637] "MountVolume.SetUp succeeded
for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/68393f76-bfd1-40fc-b0e7-01bd567a657e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"68393f76-bfd1-40fc-b0e7-01bd567a657e\") " pod="openstack/nova-metadata-0" Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.873426 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86l8l\" (UniqueName: \"kubernetes.io/projected/68393f76-bfd1-40fc-b0e7-01bd567a657e-kube-api-access-86l8l\") pod \"nova-metadata-0\" (UID: \"68393f76-bfd1-40fc-b0e7-01bd567a657e\") " pod="openstack/nova-metadata-0" Oct 01 13:15:27 crc kubenswrapper[4851]: I1001 13:15:27.911351 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 13:15:28 crc kubenswrapper[4851]: I1001 13:15:28.341889 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a82d9b2c-d132-4496-92f6-9fea17968b18" path="/var/lib/kubelet/pods/a82d9b2c-d132-4496-92f6-9fea17968b18/volumes" Oct 01 13:15:28 crc kubenswrapper[4851]: I1001 13:15:28.343296 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cea8c682-c707-4a3c-a543-87fd2b68c9ef" path="/var/lib/kubelet/pods/cea8c682-c707-4a3c-a543-87fd2b68c9ef/volumes" Oct 01 13:15:28 crc kubenswrapper[4851]: W1001 13:15:28.394145 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68393f76_bfd1_40fc_b0e7_01bd567a657e.slice/crio-d3c40141346fba4e22c0b27631d452fd2b63df625157e4c71b55ffe79f5a2146 WatchSource:0}: Error finding container d3c40141346fba4e22c0b27631d452fd2b63df625157e4c71b55ffe79f5a2146: Status 404 returned error can't find the container with id d3c40141346fba4e22c0b27631d452fd2b63df625157e4c71b55ffe79f5a2146 Oct 01 13:15:28 crc kubenswrapper[4851]: I1001 13:15:28.396448 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 13:15:28 crc kubenswrapper[4851]: I1001 13:15:28.529277 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"68393f76-bfd1-40fc-b0e7-01bd567a657e","Type":"ContainerStarted","Data":"d3c40141346fba4e22c0b27631d452fd2b63df625157e4c71b55ffe79f5a2146"} Oct 01 13:15:29 crc kubenswrapper[4851]: I1001 13:15:29.542347 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"68393f76-bfd1-40fc-b0e7-01bd567a657e","Type":"ContainerStarted","Data":"e2980cda0bf29083305c6ab5e378b29e91db010a3c7bc1aba36d3d0f3d34cbd9"} Oct 01 13:15:29 crc kubenswrapper[4851]: I1001 13:15:29.542655 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"68393f76-bfd1-40fc-b0e7-01bd567a657e","Type":"ContainerStarted","Data":"363495ed94168b195c1ebffa2e9494d2399e56656a01d28b9b93f3e13406b579"} Oct 01 13:15:29 crc kubenswrapper[4851]: I1001 13:15:29.568616 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.568601825 podStartE2EDuration="2.568601825s" podCreationTimestamp="2025-10-01 13:15:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:15:29.560379832 +0000 UTC m=+1337.905497318" watchObservedRunningTime="2025-10-01 13:15:29.568601825 +0000 UTC m=+1337.913719311" Oct 01 13:15:30 crc kubenswrapper[4851]: I1001 13:15:30.050599 4851 patch_prober.go:28] interesting 
Oct 01 13:15:30 crc kubenswrapper[4851]: I1001 13:15:30.050599 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 13:15:30 crc kubenswrapper[4851]: I1001 13:15:30.050674 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 13:15:30 crc kubenswrapper[4851]: E1001 13:15:30.081364 4851 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e810e5fad406c0d942c48802eb8d00d1aa33a47cf0d2d95e9086256e13ea5b6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 01 13:15:30 crc kubenswrapper[4851]: E1001 13:15:30.082973 4851 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e810e5fad406c0d942c48802eb8d00d1aa33a47cf0d2d95e9086256e13ea5b6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 01 13:15:30 crc kubenswrapper[4851]: E1001 13:15:30.084834 4851 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e810e5fad406c0d942c48802eb8d00d1aa33a47cf0d2d95e9086256e13ea5b6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 01 13:15:30 crc kubenswrapper[4851]: E1001 13:15:30.084873 4851 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="1d57440b-c82a-4b73-82ae-f9a432089203" containerName="nova-scheduler-scheduler"
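
The three ExecSync failures are the nova-scheduler readiness probe (/usr/bin/pgrep -r DRST nova-scheduler) racing the container's own shutdown: once CRI-O has begun stopping the container it refuses to register new exec PIDs, so the probe errors with "container is stopping" until the ContainerDied event arrives. The machine-config-daemon liveness failure just above is the same transient pattern at the HTTP level, a connection refused on 127.0.0.1:8798 while the daemon restarts. A rough sketch for tallying probe failures per pod, to separate one-off shutdown races like these from pods that are persistently unhealthy (journal text assumed on stdin):

```python
import re
import sys
from collections import Counter

# Count "Probe failed" / "Probe errored" entries per pod; the two message
# strings are taken verbatim from the prober entries above.
PROBE = re.compile(r'"Probe (?:failed|errored)".*?pod="(?P<pod>[^"]+)"')

counts = Counter(m['pod'] for line in sys.stdin for m in PROBE.finditer(line))
for pod, n in counts.most_common():
    print(f"{n:4d}  {pod}")
```
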
Oct 01 13:15:30 crc kubenswrapper[4851]: I1001 13:15:30.282070 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 01 13:15:30 crc kubenswrapper[4851]: I1001 13:15:30.282376 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="5dbbdd15-a094-4982-a2f1-370f00a1b004" containerName="kube-state-metrics" containerID="cri-o://24fb2f570ee3764bef35b360f96d4fee8a188aee286c938cb5c8e7a885e0f12c" gracePeriod=30
Oct 01 13:15:30 crc kubenswrapper[4851]: I1001 13:15:30.571466 4851 generic.go:334] "Generic (PLEG): container finished" podID="5dbbdd15-a094-4982-a2f1-370f00a1b004" containerID="24fb2f570ee3764bef35b360f96d4fee8a188aee286c938cb5c8e7a885e0f12c" exitCode=2
Oct 01 13:15:30 crc kubenswrapper[4851]: I1001 13:15:30.571937 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5dbbdd15-a094-4982-a2f1-370f00a1b004","Type":"ContainerDied","Data":"24fb2f570ee3764bef35b360f96d4fee8a188aee286c938cb5c8e7a885e0f12c"}
Oct 01 13:15:30 crc kubenswrapper[4851]: I1001 13:15:30.768791 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 01 13:15:30 crc kubenswrapper[4851]: I1001 13:15:30.899376 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drpnd\" (UniqueName: \"kubernetes.io/projected/5dbbdd15-a094-4982-a2f1-370f00a1b004-kube-api-access-drpnd\") pod \"5dbbdd15-a094-4982-a2f1-370f00a1b004\" (UID: \"5dbbdd15-a094-4982-a2f1-370f00a1b004\") "
Oct 01 13:15:30 crc kubenswrapper[4851]: I1001 13:15:30.907417 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dbbdd15-a094-4982-a2f1-370f00a1b004-kube-api-access-drpnd" (OuterVolumeSpecName: "kube-api-access-drpnd") pod "5dbbdd15-a094-4982-a2f1-370f00a1b004" (UID: "5dbbdd15-a094-4982-a2f1-370f00a1b004"). InnerVolumeSpecName "kube-api-access-drpnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.001900 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drpnd\" (UniqueName: \"kubernetes.io/projected/5dbbdd15-a094-4982-a2f1-370f00a1b004-kube-api-access-drpnd\") on node \"crc\" DevicePath \"\""
Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.582649 4851 generic.go:334] "Generic (PLEG): container finished" podID="8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b" containerID="37fdd75b45485c986f60c17b2c07beddfef2a71d34a715c47676888c61459a4f" exitCode=0
Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.582741 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-q4cn9" event={"ID":"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b","Type":"ContainerDied","Data":"37fdd75b45485c986f60c17b2c07beddfef2a71d34a715c47676888c61459a4f"}
Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.587262 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5dbbdd15-a094-4982-a2f1-370f00a1b004","Type":"ContainerDied","Data":"52d7eafdfc14345b22576837949d2cb2cb47e0fc0307800620f2901ce81bf079"}
Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.587315 4851 scope.go:117] "RemoveContainer" containerID="24fb2f570ee3764bef35b360f96d4fee8a188aee286c938cb5c8e7a885e0f12c"
Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.587313 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.658511 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.683657 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.708577 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 13:15:31 crc kubenswrapper[4851]: E1001 13:15:31.708985 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dbbdd15-a094-4982-a2f1-370f00a1b004" containerName="kube-state-metrics" Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.708998 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dbbdd15-a094-4982-a2f1-370f00a1b004" containerName="kube-state-metrics" Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.709247 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dbbdd15-a094-4982-a2f1-370f00a1b004" containerName="kube-state-metrics" Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.709959 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.712908 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.712993 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.725463 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.827920 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce21dde-21c2-49d7-aac0-cba896d9b1de-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"3ce21dde-21c2-49d7-aac0-cba896d9b1de\") " pod="openstack/kube-state-metrics-0" Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.828012 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3ce21dde-21c2-49d7-aac0-cba896d9b1de-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"3ce21dde-21c2-49d7-aac0-cba896d9b1de\") " pod="openstack/kube-state-metrics-0" Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.828049 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce21dde-21c2-49d7-aac0-cba896d9b1de-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"3ce21dde-21c2-49d7-aac0-cba896d9b1de\") " pod="openstack/kube-state-metrics-0" Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.828066 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p642\" (UniqueName: \"kubernetes.io/projected/3ce21dde-21c2-49d7-aac0-cba896d9b1de-kube-api-access-6p642\") pod \"kube-state-metrics-0\" (UID: \"3ce21dde-21c2-49d7-aac0-cba896d9b1de\") " pod="openstack/kube-state-metrics-0" Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.932007 4851 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce21dde-21c2-49d7-aac0-cba896d9b1de-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"3ce21dde-21c2-49d7-aac0-cba896d9b1de\") " pod="openstack/kube-state-metrics-0" Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.932057 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3ce21dde-21c2-49d7-aac0-cba896d9b1de-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"3ce21dde-21c2-49d7-aac0-cba896d9b1de\") " pod="openstack/kube-state-metrics-0" Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.932083 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce21dde-21c2-49d7-aac0-cba896d9b1de-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"3ce21dde-21c2-49d7-aac0-cba896d9b1de\") " pod="openstack/kube-state-metrics-0" Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.932101 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p642\" (UniqueName: \"kubernetes.io/projected/3ce21dde-21c2-49d7-aac0-cba896d9b1de-kube-api-access-6p642\") pod \"kube-state-metrics-0\" (UID: \"3ce21dde-21c2-49d7-aac0-cba896d9b1de\") " pod="openstack/kube-state-metrics-0" Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.937715 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce21dde-21c2-49d7-aac0-cba896d9b1de-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"3ce21dde-21c2-49d7-aac0-cba896d9b1de\") " pod="openstack/kube-state-metrics-0" Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.937770 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3ce21dde-21c2-49d7-aac0-cba896d9b1de-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"3ce21dde-21c2-49d7-aac0-cba896d9b1de\") " pod="openstack/kube-state-metrics-0" Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.938679 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce21dde-21c2-49d7-aac0-cba896d9b1de-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"3ce21dde-21c2-49d7-aac0-cba896d9b1de\") " pod="openstack/kube-state-metrics-0" Oct 01 13:15:31 crc kubenswrapper[4851]: I1001 13:15:31.949928 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p642\" (UniqueName: \"kubernetes.io/projected/3ce21dde-21c2-49d7-aac0-cba896d9b1de-kube-api-access-6p642\") pod \"kube-state-metrics-0\" (UID: \"3ce21dde-21c2-49d7-aac0-cba896d9b1de\") " pod="openstack/kube-state-metrics-0" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.030135 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.052973 4851 util.go:48] "No ready sandbox for pod can be found. 
Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.052973 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.138350 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e85f88e-bad5-4b06-8767-b1720bacf2ea-config-data\") pod \"1e85f88e-bad5-4b06-8767-b1720bacf2ea\" (UID: \"1e85f88e-bad5-4b06-8767-b1720bacf2ea\") "
Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.138602 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e85f88e-bad5-4b06-8767-b1720bacf2ea-logs\") pod \"1e85f88e-bad5-4b06-8767-b1720bacf2ea\" (UID: \"1e85f88e-bad5-4b06-8767-b1720bacf2ea\") "
Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.138736 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh7jh\" (UniqueName: \"kubernetes.io/projected/1e85f88e-bad5-4b06-8767-b1720bacf2ea-kube-api-access-mh7jh\") pod \"1e85f88e-bad5-4b06-8767-b1720bacf2ea\" (UID: \"1e85f88e-bad5-4b06-8767-b1720bacf2ea\") "
Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.138786 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e85f88e-bad5-4b06-8767-b1720bacf2ea-combined-ca-bundle\") pod \"1e85f88e-bad5-4b06-8767-b1720bacf2ea\" (UID: \"1e85f88e-bad5-4b06-8767-b1720bacf2ea\") "
Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.139055 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e85f88e-bad5-4b06-8767-b1720bacf2ea-logs" (OuterVolumeSpecName: "logs") pod "1e85f88e-bad5-4b06-8767-b1720bacf2ea" (UID: "1e85f88e-bad5-4b06-8767-b1720bacf2ea"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.139334 4851 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e85f88e-bad5-4b06-8767-b1720bacf2ea-logs\") on node \"crc\" DevicePath \"\""
Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.142609 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e85f88e-bad5-4b06-8767-b1720bacf2ea-kube-api-access-mh7jh" (OuterVolumeSpecName: "kube-api-access-mh7jh") pod "1e85f88e-bad5-4b06-8767-b1720bacf2ea" (UID: "1e85f88e-bad5-4b06-8767-b1720bacf2ea"). InnerVolumeSpecName "kube-api-access-mh7jh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.172176 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e85f88e-bad5-4b06-8767-b1720bacf2ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e85f88e-bad5-4b06-8767-b1720bacf2ea" (UID: "1e85f88e-bad5-4b06-8767-b1720bacf2ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.182273 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e85f88e-bad5-4b06-8767-b1720bacf2ea-config-data" (OuterVolumeSpecName: "config-data") pod "1e85f88e-bad5-4b06-8767-b1720bacf2ea" (UID: "1e85f88e-bad5-4b06-8767-b1720bacf2ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.241682 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh7jh\" (UniqueName: \"kubernetes.io/projected/1e85f88e-bad5-4b06-8767-b1720bacf2ea-kube-api-access-mh7jh\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.241746 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e85f88e-bad5-4b06-8767-b1720bacf2ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.241760 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e85f88e-bad5-4b06-8767-b1720bacf2ea-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.258606 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.259317 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe30e726-ab37-4631-ac05-9d8bd9fdc11d" containerName="sg-core" containerID="cri-o://d1049ceca9f4e8154f1b9b3e8f4314a6f27ddd2c51be7e3439319f6a7918d733" gracePeriod=30 Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.259491 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe30e726-ab37-4631-ac05-9d8bd9fdc11d" containerName="proxy-httpd" containerID="cri-o://0d7557d025ec42c72d6586138fc901a704a8f48a305f88df5bea6a1e240b56d4" gracePeriod=30 Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.259856 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe30e726-ab37-4631-ac05-9d8bd9fdc11d" containerName="ceilometer-notification-agent" containerID="cri-o://21fe2fda604a2deb41cb3d92c904d646bc494b49c80cf037831b88ac0604badf" gracePeriod=30 Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.270939 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe30e726-ab37-4631-ac05-9d8bd9fdc11d" containerName="ceilometer-central-agent" containerID="cri-o://32a17062b9da1995cd1c182b579b4ded2007df740da4b0c82735407b44f536bc" gracePeriod=30 Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.339057 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dbbdd15-a094-4982-a2f1-370f00a1b004" path="/var/lib/kubelet/pods/5dbbdd15-a094-4982-a2f1-370f00a1b004/volumes" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.568664 4851 util.go:48] "No ready sandbox for pod can be found. 
Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.568664 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.580647 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.629730 4851 generic.go:334] "Generic (PLEG): container finished" podID="1e85f88e-bad5-4b06-8767-b1720bacf2ea" containerID="8caa28e34abe628c54bb4e1a0c9785f56476f4c1eb2b783560c3917b4ffc54cc" exitCode=0
Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.629781 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1e85f88e-bad5-4b06-8767-b1720bacf2ea","Type":"ContainerDied","Data":"8caa28e34abe628c54bb4e1a0c9785f56476f4c1eb2b783560c3917b4ffc54cc"}
Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.629805 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1e85f88e-bad5-4b06-8767-b1720bacf2ea","Type":"ContainerDied","Data":"8a60607b540237f872f52dcee09e602d3da695b95b52af858ceb98bac6f7272d"}
Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.629821 4851 scope.go:117] "RemoveContainer" containerID="8caa28e34abe628c54bb4e1a0c9785f56476f4c1eb2b783560c3917b4ffc54cc"
Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.629903 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.634267 4851 generic.go:334] "Generic (PLEG): container finished" podID="1d57440b-c82a-4b73-82ae-f9a432089203" containerID="5e810e5fad406c0d942c48802eb8d00d1aa33a47cf0d2d95e9086256e13ea5b6" exitCode=0
Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.634352 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.634733 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1d57440b-c82a-4b73-82ae-f9a432089203","Type":"ContainerDied","Data":"5e810e5fad406c0d942c48802eb8d00d1aa33a47cf0d2d95e9086256e13ea5b6"} Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.634759 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1d57440b-c82a-4b73-82ae-f9a432089203","Type":"ContainerDied","Data":"c31f4ff6a04e143d77295c65dc99eb274e725f272e14d23a67557303725b2f48"} Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.638558 4851 generic.go:334] "Generic (PLEG): container finished" podID="fe30e726-ab37-4631-ac05-9d8bd9fdc11d" containerID="0d7557d025ec42c72d6586138fc901a704a8f48a305f88df5bea6a1e240b56d4" exitCode=0 Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.638582 4851 generic.go:334] "Generic (PLEG): container finished" podID="fe30e726-ab37-4631-ac05-9d8bd9fdc11d" containerID="d1049ceca9f4e8154f1b9b3e8f4314a6f27ddd2c51be7e3439319f6a7918d733" exitCode=2 Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.638618 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe30e726-ab37-4631-ac05-9d8bd9fdc11d","Type":"ContainerDied","Data":"0d7557d025ec42c72d6586138fc901a704a8f48a305f88df5bea6a1e240b56d4"} Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.638639 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe30e726-ab37-4631-ac05-9d8bd9fdc11d","Type":"ContainerDied","Data":"d1049ceca9f4e8154f1b9b3e8f4314a6f27ddd2c51be7e3439319f6a7918d733"} Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.655385 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.675651 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.677869 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3ce21dde-21c2-49d7-aac0-cba896d9b1de","Type":"ContainerStarted","Data":"48a0f1da290b148b4ec76c82fcf465e424fc357485cf09ababc44b82807c47fd"} Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.685298 4851 scope.go:117] "RemoveContainer" containerID="29bdd0f6232c950c532fbfd0eeb68ef456e736441d843c146e37b969bc59871e" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.701607 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 13:15:32 crc kubenswrapper[4851]: E1001 13:15:32.704197 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e85f88e-bad5-4b06-8767-b1720bacf2ea" containerName="nova-api-api" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.704230 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e85f88e-bad5-4b06-8767-b1720bacf2ea" containerName="nova-api-api" Oct 01 13:15:32 crc kubenswrapper[4851]: E1001 13:15:32.704246 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d57440b-c82a-4b73-82ae-f9a432089203" containerName="nova-scheduler-scheduler" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.704252 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d57440b-c82a-4b73-82ae-f9a432089203" containerName="nova-scheduler-scheduler" Oct 01 13:15:32 crc kubenswrapper[4851]: E1001 13:15:32.704276 
4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e85f88e-bad5-4b06-8767-b1720bacf2ea" containerName="nova-api-log" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.704284 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e85f88e-bad5-4b06-8767-b1720bacf2ea" containerName="nova-api-log" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.704572 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e85f88e-bad5-4b06-8767-b1720bacf2ea" containerName="nova-api-api" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.704603 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e85f88e-bad5-4b06-8767-b1720bacf2ea" containerName="nova-api-log" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.704638 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d57440b-c82a-4b73-82ae-f9a432089203" containerName="nova-scheduler-scheduler" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.705669 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.711696 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.733416 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.740202 4851 scope.go:117] "RemoveContainer" containerID="8caa28e34abe628c54bb4e1a0c9785f56476f4c1eb2b783560c3917b4ffc54cc" Oct 01 13:15:32 crc kubenswrapper[4851]: E1001 13:15:32.740575 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8caa28e34abe628c54bb4e1a0c9785f56476f4c1eb2b783560c3917b4ffc54cc\": container with ID starting with 8caa28e34abe628c54bb4e1a0c9785f56476f4c1eb2b783560c3917b4ffc54cc not found: ID does not exist" containerID="8caa28e34abe628c54bb4e1a0c9785f56476f4c1eb2b783560c3917b4ffc54cc" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.740897 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8caa28e34abe628c54bb4e1a0c9785f56476f4c1eb2b783560c3917b4ffc54cc"} err="failed to get container status \"8caa28e34abe628c54bb4e1a0c9785f56476f4c1eb2b783560c3917b4ffc54cc\": rpc error: code = NotFound desc = could not find container \"8caa28e34abe628c54bb4e1a0c9785f56476f4c1eb2b783560c3917b4ffc54cc\": container with ID starting with 8caa28e34abe628c54bb4e1a0c9785f56476f4c1eb2b783560c3917b4ffc54cc not found: ID does not exist" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.740929 4851 scope.go:117] "RemoveContainer" containerID="29bdd0f6232c950c532fbfd0eeb68ef456e736441d843c146e37b969bc59871e" Oct 01 13:15:32 crc kubenswrapper[4851]: E1001 13:15:32.741247 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29bdd0f6232c950c532fbfd0eeb68ef456e736441d843c146e37b969bc59871e\": container with ID starting with 29bdd0f6232c950c532fbfd0eeb68ef456e736441d843c146e37b969bc59871e not found: ID does not exist" containerID="29bdd0f6232c950c532fbfd0eeb68ef456e736441d843c146e37b969bc59871e" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.741291 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29bdd0f6232c950c532fbfd0eeb68ef456e736441d843c146e37b969bc59871e"} err="failed to get 
container status \"29bdd0f6232c950c532fbfd0eeb68ef456e736441d843c146e37b969bc59871e\": rpc error: code = NotFound desc = could not find container \"29bdd0f6232c950c532fbfd0eeb68ef456e736441d843c146e37b969bc59871e\": container with ID starting with 29bdd0f6232c950c532fbfd0eeb68ef456e736441d843c146e37b969bc59871e not found: ID does not exist" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.741321 4851 scope.go:117] "RemoveContainer" containerID="5e810e5fad406c0d942c48802eb8d00d1aa33a47cf0d2d95e9086256e13ea5b6" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.753738 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfs24\" (UniqueName: \"kubernetes.io/projected/1d57440b-c82a-4b73-82ae-f9a432089203-kube-api-access-xfs24\") pod \"1d57440b-c82a-4b73-82ae-f9a432089203\" (UID: \"1d57440b-c82a-4b73-82ae-f9a432089203\") " Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.753934 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d57440b-c82a-4b73-82ae-f9a432089203-combined-ca-bundle\") pod \"1d57440b-c82a-4b73-82ae-f9a432089203\" (UID: \"1d57440b-c82a-4b73-82ae-f9a432089203\") " Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.753965 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d57440b-c82a-4b73-82ae-f9a432089203-config-data\") pod \"1d57440b-c82a-4b73-82ae-f9a432089203\" (UID: \"1d57440b-c82a-4b73-82ae-f9a432089203\") " Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.760667 4851 scope.go:117] "RemoveContainer" containerID="5e810e5fad406c0d942c48802eb8d00d1aa33a47cf0d2d95e9086256e13ea5b6" Oct 01 13:15:32 crc kubenswrapper[4851]: E1001 13:15:32.761118 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e810e5fad406c0d942c48802eb8d00d1aa33a47cf0d2d95e9086256e13ea5b6\": container with ID starting with 5e810e5fad406c0d942c48802eb8d00d1aa33a47cf0d2d95e9086256e13ea5b6 not found: ID does not exist" containerID="5e810e5fad406c0d942c48802eb8d00d1aa33a47cf0d2d95e9086256e13ea5b6" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.761224 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e810e5fad406c0d942c48802eb8d00d1aa33a47cf0d2d95e9086256e13ea5b6"} err="failed to get container status \"5e810e5fad406c0d942c48802eb8d00d1aa33a47cf0d2d95e9086256e13ea5b6\": rpc error: code = NotFound desc = could not find container \"5e810e5fad406c0d942c48802eb8d00d1aa33a47cf0d2d95e9086256e13ea5b6\": container with ID starting with 5e810e5fad406c0d942c48802eb8d00d1aa33a47cf0d2d95e9086256e13ea5b6 not found: ID does not exist" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.780730 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d57440b-c82a-4b73-82ae-f9a432089203-kube-api-access-xfs24" (OuterVolumeSpecName: "kube-api-access-xfs24") pod "1d57440b-c82a-4b73-82ae-f9a432089203" (UID: "1d57440b-c82a-4b73-82ae-f9a432089203"). InnerVolumeSpecName "kube-api-access-xfs24". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.786632 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d57440b-c82a-4b73-82ae-f9a432089203-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d57440b-c82a-4b73-82ae-f9a432089203" (UID: "1d57440b-c82a-4b73-82ae-f9a432089203"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.794515 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d57440b-c82a-4b73-82ae-f9a432089203-config-data" (OuterVolumeSpecName: "config-data") pod "1d57440b-c82a-4b73-82ae-f9a432089203" (UID: "1d57440b-c82a-4b73-82ae-f9a432089203"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.857120 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97cbbb39-fa41-45f3-9d97-a9509e7983ae-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"97cbbb39-fa41-45f3-9d97-a9509e7983ae\") " pod="openstack/nova-api-0" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.857174 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhqsv\" (UniqueName: \"kubernetes.io/projected/97cbbb39-fa41-45f3-9d97-a9509e7983ae-kube-api-access-dhqsv\") pod \"nova-api-0\" (UID: \"97cbbb39-fa41-45f3-9d97-a9509e7983ae\") " pod="openstack/nova-api-0" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.857292 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97cbbb39-fa41-45f3-9d97-a9509e7983ae-logs\") pod \"nova-api-0\" (UID: \"97cbbb39-fa41-45f3-9d97-a9509e7983ae\") " pod="openstack/nova-api-0" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.857564 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97cbbb39-fa41-45f3-9d97-a9509e7983ae-config-data\") pod \"nova-api-0\" (UID: \"97cbbb39-fa41-45f3-9d97-a9509e7983ae\") " pod="openstack/nova-api-0" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.857705 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d57440b-c82a-4b73-82ae-f9a432089203-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.857719 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d57440b-c82a-4b73-82ae-f9a432089203-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.857733 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfs24\" (UniqueName: \"kubernetes.io/projected/1d57440b-c82a-4b73-82ae-f9a432089203-kube-api-access-xfs24\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.911906 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.911952 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 
01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.963634 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97cbbb39-fa41-45f3-9d97-a9509e7983ae-config-data\") pod \"nova-api-0\" (UID: \"97cbbb39-fa41-45f3-9d97-a9509e7983ae\") " pod="openstack/nova-api-0" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.963717 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97cbbb39-fa41-45f3-9d97-a9509e7983ae-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"97cbbb39-fa41-45f3-9d97-a9509e7983ae\") " pod="openstack/nova-api-0" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.963751 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhqsv\" (UniqueName: \"kubernetes.io/projected/97cbbb39-fa41-45f3-9d97-a9509e7983ae-kube-api-access-dhqsv\") pod \"nova-api-0\" (UID: \"97cbbb39-fa41-45f3-9d97-a9509e7983ae\") " pod="openstack/nova-api-0" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.963788 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97cbbb39-fa41-45f3-9d97-a9509e7983ae-logs\") pod \"nova-api-0\" (UID: \"97cbbb39-fa41-45f3-9d97-a9509e7983ae\") " pod="openstack/nova-api-0" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.964667 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97cbbb39-fa41-45f3-9d97-a9509e7983ae-logs\") pod \"nova-api-0\" (UID: \"97cbbb39-fa41-45f3-9d97-a9509e7983ae\") " pod="openstack/nova-api-0" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.967494 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97cbbb39-fa41-45f3-9d97-a9509e7983ae-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"97cbbb39-fa41-45f3-9d97-a9509e7983ae\") " pod="openstack/nova-api-0" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.967570 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97cbbb39-fa41-45f3-9d97-a9509e7983ae-config-data\") pod \"nova-api-0\" (UID: \"97cbbb39-fa41-45f3-9d97-a9509e7983ae\") " pod="openstack/nova-api-0" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.985201 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.987325 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhqsv\" (UniqueName: \"kubernetes.io/projected/97cbbb39-fa41-45f3-9d97-a9509e7983ae-kube-api-access-dhqsv\") pod \"nova-api-0\" (UID: \"97cbbb39-fa41-45f3-9d97-a9509e7983ae\") " pod="openstack/nova-api-0" Oct 01 13:15:32 crc kubenswrapper[4851]: I1001 13:15:32.997092 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.007839 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.009682 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.011673 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.033118 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.040643 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.093833 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-q4cn9" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.170667 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58cd22d1-6ef7-4719-bf22-8658c400786c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"58cd22d1-6ef7-4719-bf22-8658c400786c\") " pod="openstack/nova-scheduler-0" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.170913 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58cd22d1-6ef7-4719-bf22-8658c400786c-config-data\") pod \"nova-scheduler-0\" (UID: \"58cd22d1-6ef7-4719-bf22-8658c400786c\") " pod="openstack/nova-scheduler-0" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.171127 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg2pw\" (UniqueName: \"kubernetes.io/projected/58cd22d1-6ef7-4719-bf22-8658c400786c-kube-api-access-cg2pw\") pod \"nova-scheduler-0\" (UID: \"58cd22d1-6ef7-4719-bf22-8658c400786c\") " pod="openstack/nova-scheduler-0" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.272266 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d6tw\" (UniqueName: \"kubernetes.io/projected/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-kube-api-access-2d6tw\") pod \"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b\" (UID: \"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b\") " Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.272368 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-combined-ca-bundle\") pod \"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b\" (UID: \"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b\") " Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.272440 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-scripts\") pod \"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b\" (UID: \"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b\") " Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.272489 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-config-data\") pod \"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b\" (UID: \"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b\") " Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.272817 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg2pw\" (UniqueName: 
\"kubernetes.io/projected/58cd22d1-6ef7-4719-bf22-8658c400786c-kube-api-access-cg2pw\") pod \"nova-scheduler-0\" (UID: \"58cd22d1-6ef7-4719-bf22-8658c400786c\") " pod="openstack/nova-scheduler-0" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.272891 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58cd22d1-6ef7-4719-bf22-8658c400786c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"58cd22d1-6ef7-4719-bf22-8658c400786c\") " pod="openstack/nova-scheduler-0" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.272976 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58cd22d1-6ef7-4719-bf22-8658c400786c-config-data\") pod \"nova-scheduler-0\" (UID: \"58cd22d1-6ef7-4719-bf22-8658c400786c\") " pod="openstack/nova-scheduler-0" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.277134 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-kube-api-access-2d6tw" (OuterVolumeSpecName: "kube-api-access-2d6tw") pod "8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b" (UID: "8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b"). InnerVolumeSpecName "kube-api-access-2d6tw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.278033 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58cd22d1-6ef7-4719-bf22-8658c400786c-config-data\") pod \"nova-scheduler-0\" (UID: \"58cd22d1-6ef7-4719-bf22-8658c400786c\") " pod="openstack/nova-scheduler-0" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.278242 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58cd22d1-6ef7-4719-bf22-8658c400786c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"58cd22d1-6ef7-4719-bf22-8658c400786c\") " pod="openstack/nova-scheduler-0" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.280731 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-scripts" (OuterVolumeSpecName: "scripts") pod "8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b" (UID: "8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.295907 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg2pw\" (UniqueName: \"kubernetes.io/projected/58cd22d1-6ef7-4719-bf22-8658c400786c-kube-api-access-cg2pw\") pod \"nova-scheduler-0\" (UID: \"58cd22d1-6ef7-4719-bf22-8658c400786c\") " pod="openstack/nova-scheduler-0" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.306277 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b" (UID: "8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.312642 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-config-data" (OuterVolumeSpecName: "config-data") pod "8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b" (UID: "8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.374757 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.374798 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.374810 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d6tw\" (UniqueName: \"kubernetes.io/projected/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-kube-api-access-2d6tw\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.374819 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.401588 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.544235 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:15:33 crc kubenswrapper[4851]: W1001 13:15:33.548779 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97cbbb39_fa41_45f3_9d97_a9509e7983ae.slice/crio-64d10fb013c28a3f9d0dc8010dd05c2305843d8399153b81a1581b518a2d4ae4 WatchSource:0}: Error finding container 64d10fb013c28a3f9d0dc8010dd05c2305843d8399153b81a1581b518a2d4ae4: Status 404 returned error can't find the container with id 64d10fb013c28a3f9d0dc8010dd05c2305843d8399153b81a1581b518a2d4ae4 Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.704122 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 13:15:33 crc kubenswrapper[4851]: E1001 13:15:33.704622 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b" containerName="nova-cell1-conductor-db-sync" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.704638 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b" containerName="nova-cell1-conductor-db-sync" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.704849 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b" containerName="nova-cell1-conductor-db-sync" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.705594 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.735049 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97cbbb39-fa41-45f3-9d97-a9509e7983ae","Type":"ContainerStarted","Data":"64d10fb013c28a3f9d0dc8010dd05c2305843d8399153b81a1581b518a2d4ae4"} Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.740357 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.755624 4851 generic.go:334] "Generic (PLEG): container finished" podID="fe30e726-ab37-4631-ac05-9d8bd9fdc11d" containerID="32a17062b9da1995cd1c182b579b4ded2007df740da4b0c82735407b44f536bc" exitCode=0 Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.755748 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe30e726-ab37-4631-ac05-9d8bd9fdc11d","Type":"ContainerDied","Data":"32a17062b9da1995cd1c182b579b4ded2007df740da4b0c82735407b44f536bc"} Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.758103 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-q4cn9" event={"ID":"8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b","Type":"ContainerDied","Data":"d8cb411493f6acb9ff39004fbcf3e39a1d996238b7a1c7e010c25d22bfbe03e0"} Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.758138 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8cb411493f6acb9ff39004fbcf3e39a1d996238b7a1c7e010c25d22bfbe03e0" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.758207 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-q4cn9" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.762342 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3ce21dde-21c2-49d7-aac0-cba896d9b1de","Type":"ContainerStarted","Data":"1a31587cd678127c95343ac985fe92cf05cc5fa97164f2db0be67fb7fd233c2e"} Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.763292 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.789658 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.308931164 podStartE2EDuration="2.789639062s" podCreationTimestamp="2025-10-01 13:15:31 +0000 UTC" firstStartedPulling="2025-10-01 13:15:32.589143654 +0000 UTC m=+1340.934261140" lastFinishedPulling="2025-10-01 13:15:33.069851552 +0000 UTC m=+1341.414969038" observedRunningTime="2025-10-01 13:15:33.782154918 +0000 UTC m=+1342.127272404" watchObservedRunningTime="2025-10-01 13:15:33.789639062 +0000 UTC m=+1342.134756548" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.884870 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e2cacf3-b4a4-4839-a723-4bd4578935a7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9e2cacf3-b4a4-4839-a723-4bd4578935a7\") " pod="openstack/nova-cell1-conductor-0" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.885251 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbvtk\" (UniqueName: 
\"kubernetes.io/projected/9e2cacf3-b4a4-4839-a723-4bd4578935a7-kube-api-access-pbvtk\") pod \"nova-cell1-conductor-0\" (UID: \"9e2cacf3-b4a4-4839-a723-4bd4578935a7\") " pod="openstack/nova-cell1-conductor-0" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.885431 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e2cacf3-b4a4-4839-a723-4bd4578935a7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9e2cacf3-b4a4-4839-a723-4bd4578935a7\") " pod="openstack/nova-cell1-conductor-0" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.917872 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:15:33 crc kubenswrapper[4851]: W1001 13:15:33.920630 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58cd22d1_6ef7_4719_bf22_8658c400786c.slice/crio-5544a49d707273c984c567effd50e7610de68866cd1ebb5f729b9f0a7723d49d WatchSource:0}: Error finding container 5544a49d707273c984c567effd50e7610de68866cd1ebb5f729b9f0a7723d49d: Status 404 returned error can't find the container with id 5544a49d707273c984c567effd50e7610de68866cd1ebb5f729b9f0a7723d49d Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.986711 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e2cacf3-b4a4-4839-a723-4bd4578935a7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9e2cacf3-b4a4-4839-a723-4bd4578935a7\") " pod="openstack/nova-cell1-conductor-0" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.986817 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e2cacf3-b4a4-4839-a723-4bd4578935a7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9e2cacf3-b4a4-4839-a723-4bd4578935a7\") " pod="openstack/nova-cell1-conductor-0" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.986856 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbvtk\" (UniqueName: \"kubernetes.io/projected/9e2cacf3-b4a4-4839-a723-4bd4578935a7-kube-api-access-pbvtk\") pod \"nova-cell1-conductor-0\" (UID: \"9e2cacf3-b4a4-4839-a723-4bd4578935a7\") " pod="openstack/nova-cell1-conductor-0" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.990704 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e2cacf3-b4a4-4839-a723-4bd4578935a7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9e2cacf3-b4a4-4839-a723-4bd4578935a7\") " pod="openstack/nova-cell1-conductor-0" Oct 01 13:15:33 crc kubenswrapper[4851]: I1001 13:15:33.991445 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e2cacf3-b4a4-4839-a723-4bd4578935a7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9e2cacf3-b4a4-4839-a723-4bd4578935a7\") " pod="openstack/nova-cell1-conductor-0" Oct 01 13:15:34 crc kubenswrapper[4851]: I1001 13:15:34.007828 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbvtk\" (UniqueName: \"kubernetes.io/projected/9e2cacf3-b4a4-4839-a723-4bd4578935a7-kube-api-access-pbvtk\") pod \"nova-cell1-conductor-0\" (UID: \"9e2cacf3-b4a4-4839-a723-4bd4578935a7\") " pod="openstack/nova-cell1-conductor-0" Oct 01 
13:15:34 crc kubenswrapper[4851]: I1001 13:15:34.054990 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 01 13:15:34 crc kubenswrapper[4851]: I1001 13:15:34.339565 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d57440b-c82a-4b73-82ae-f9a432089203" path="/var/lib/kubelet/pods/1d57440b-c82a-4b73-82ae-f9a432089203/volumes" Oct 01 13:15:34 crc kubenswrapper[4851]: I1001 13:15:34.340356 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e85f88e-bad5-4b06-8767-b1720bacf2ea" path="/var/lib/kubelet/pods/1e85f88e-bad5-4b06-8767-b1720bacf2ea/volumes" Oct 01 13:15:34 crc kubenswrapper[4851]: I1001 13:15:34.491273 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 13:15:34 crc kubenswrapper[4851]: I1001 13:15:34.774984 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97cbbb39-fa41-45f3-9d97-a9509e7983ae","Type":"ContainerStarted","Data":"675bffdac6025adc8c329864bb268c6ce7ff9509988afd05e4e88b5e450e77f6"} Oct 01 13:15:34 crc kubenswrapper[4851]: I1001 13:15:34.775359 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97cbbb39-fa41-45f3-9d97-a9509e7983ae","Type":"ContainerStarted","Data":"304778a0f457a1dd5fe8201652857cd01f42b454123e4963eed28bf770620bbc"} Oct 01 13:15:34 crc kubenswrapper[4851]: I1001 13:15:34.778158 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9e2cacf3-b4a4-4839-a723-4bd4578935a7","Type":"ContainerStarted","Data":"95bc2c7839da6a985bf65478cef8de59b24b504e680b8f6728efc75a729585fc"} Oct 01 13:15:34 crc kubenswrapper[4851]: I1001 13:15:34.778221 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9e2cacf3-b4a4-4839-a723-4bd4578935a7","Type":"ContainerStarted","Data":"5bb1a25cb6a0f8d890ef5b1e3261dc6637234b4caf3e835a7a9163841d31914d"} Oct 01 13:15:34 crc kubenswrapper[4851]: I1001 13:15:34.779871 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 01 13:15:34 crc kubenswrapper[4851]: I1001 13:15:34.784029 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"58cd22d1-6ef7-4719-bf22-8658c400786c","Type":"ContainerStarted","Data":"7c598c2dc9bd731a9c1659e63f57a189fd4d020c746a568812112b0d92d7a536"} Oct 01 13:15:34 crc kubenswrapper[4851]: I1001 13:15:34.784262 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"58cd22d1-6ef7-4719-bf22-8658c400786c","Type":"ContainerStarted","Data":"5544a49d707273c984c567effd50e7610de68866cd1ebb5f729b9f0a7723d49d"} Oct 01 13:15:34 crc kubenswrapper[4851]: I1001 13:15:34.802392 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.802374188 podStartE2EDuration="2.802374188s" podCreationTimestamp="2025-10-01 13:15:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:15:34.795948493 +0000 UTC m=+1343.141065989" watchObservedRunningTime="2025-10-01 13:15:34.802374188 +0000 UTC m=+1343.147491674" Oct 01 13:15:34 crc kubenswrapper[4851]: I1001 13:15:34.821013 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" 
podStartSLOduration=2.820988115 podStartE2EDuration="2.820988115s" podCreationTimestamp="2025-10-01 13:15:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:15:34.810193161 +0000 UTC m=+1343.155310657" watchObservedRunningTime="2025-10-01 13:15:34.820988115 +0000 UTC m=+1343.166105601" Oct 01 13:15:34 crc kubenswrapper[4851]: I1001 13:15:34.839166 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.839142239 podStartE2EDuration="1.839142239s" podCreationTimestamp="2025-10-01 13:15:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:15:34.829096205 +0000 UTC m=+1343.174213711" watchObservedRunningTime="2025-10-01 13:15:34.839142239 +0000 UTC m=+1343.184259735" Oct 01 13:15:37 crc kubenswrapper[4851]: I1001 13:15:37.911568 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 13:15:37 crc kubenswrapper[4851]: I1001 13:15:37.911998 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 13:15:38 crc kubenswrapper[4851]: I1001 13:15:38.402711 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 01 13:15:38 crc kubenswrapper[4851]: I1001 13:15:38.845059 4851 generic.go:334] "Generic (PLEG): container finished" podID="fe30e726-ab37-4631-ac05-9d8bd9fdc11d" containerID="21fe2fda604a2deb41cb3d92c904d646bc494b49c80cf037831b88ac0604badf" exitCode=0 Oct 01 13:15:38 crc kubenswrapper[4851]: I1001 13:15:38.845107 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe30e726-ab37-4631-ac05-9d8bd9fdc11d","Type":"ContainerDied","Data":"21fe2fda604a2deb41cb3d92c904d646bc494b49c80cf037831b88ac0604badf"} Oct 01 13:15:38 crc kubenswrapper[4851]: I1001 13:15:38.926320 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="68393f76-bfd1-40fc-b0e7-01bd567a657e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 13:15:38 crc kubenswrapper[4851]: I1001 13:15:38.928524 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="68393f76-bfd1-40fc-b0e7-01bd567a657e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.082993 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.288194 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.434669 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ctrt\" (UniqueName: \"kubernetes.io/projected/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-kube-api-access-4ctrt\") pod \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.434747 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-log-httpd\") pod \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.434779 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-combined-ca-bundle\") pod \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.434809 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-run-httpd\") pod \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.434845 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-sg-core-conf-yaml\") pod \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.434922 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-config-data\") pod \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.434988 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-scripts\") pod \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\" (UID: \"fe30e726-ab37-4631-ac05-9d8bd9fdc11d\") " Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.435588 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fe30e726-ab37-4631-ac05-9d8bd9fdc11d" (UID: "fe30e726-ab37-4631-ac05-9d8bd9fdc11d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.435671 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fe30e726-ab37-4631-ac05-9d8bd9fdc11d" (UID: "fe30e726-ab37-4631-ac05-9d8bd9fdc11d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.436146 4851 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.436172 4851 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.442865 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-scripts" (OuterVolumeSpecName: "scripts") pod "fe30e726-ab37-4631-ac05-9d8bd9fdc11d" (UID: "fe30e726-ab37-4631-ac05-9d8bd9fdc11d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.443011 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-kube-api-access-4ctrt" (OuterVolumeSpecName: "kube-api-access-4ctrt") pod "fe30e726-ab37-4631-ac05-9d8bd9fdc11d" (UID: "fe30e726-ab37-4631-ac05-9d8bd9fdc11d"). InnerVolumeSpecName "kube-api-access-4ctrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.497700 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fe30e726-ab37-4631-ac05-9d8bd9fdc11d" (UID: "fe30e726-ab37-4631-ac05-9d8bd9fdc11d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.535123 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-config-data" (OuterVolumeSpecName: "config-data") pod "fe30e726-ab37-4631-ac05-9d8bd9fdc11d" (UID: "fe30e726-ab37-4631-ac05-9d8bd9fdc11d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.538565 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ctrt\" (UniqueName: \"kubernetes.io/projected/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-kube-api-access-4ctrt\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.538599 4851 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.538611 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.538623 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.558696 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe30e726-ab37-4631-ac05-9d8bd9fdc11d" (UID: "fe30e726-ab37-4631-ac05-9d8bd9fdc11d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.640246 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe30e726-ab37-4631-ac05-9d8bd9fdc11d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.858186 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe30e726-ab37-4631-ac05-9d8bd9fdc11d","Type":"ContainerDied","Data":"333f1abc7f6a0888e59f60685b953575bd5fe206d8c3d9b57aa62986623505ee"} Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.858238 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.858257 4851 scope.go:117] "RemoveContainer" containerID="0d7557d025ec42c72d6586138fc901a704a8f48a305f88df5bea6a1e240b56d4" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.879815 4851 scope.go:117] "RemoveContainer" containerID="d1049ceca9f4e8154f1b9b3e8f4314a6f27ddd2c51be7e3439319f6a7918d733" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.898116 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.908344 4851 scope.go:117] "RemoveContainer" containerID="21fe2fda604a2deb41cb3d92c904d646bc494b49c80cf037831b88ac0604badf" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.920235 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.930950 4851 scope.go:117] "RemoveContainer" containerID="32a17062b9da1995cd1c182b579b4ded2007df740da4b0c82735407b44f536bc" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.933010 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:15:39 crc kubenswrapper[4851]: E1001 13:15:39.933626 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe30e726-ab37-4631-ac05-9d8bd9fdc11d" containerName="proxy-httpd" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.933655 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe30e726-ab37-4631-ac05-9d8bd9fdc11d" containerName="proxy-httpd" Oct 01 13:15:39 crc kubenswrapper[4851]: E1001 13:15:39.933681 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe30e726-ab37-4631-ac05-9d8bd9fdc11d" containerName="ceilometer-central-agent" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.933691 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe30e726-ab37-4631-ac05-9d8bd9fdc11d" containerName="ceilometer-central-agent" Oct 01 13:15:39 crc kubenswrapper[4851]: E1001 13:15:39.933716 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe30e726-ab37-4631-ac05-9d8bd9fdc11d" containerName="sg-core" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.933725 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe30e726-ab37-4631-ac05-9d8bd9fdc11d" containerName="sg-core" Oct 01 13:15:39 crc kubenswrapper[4851]: E1001 13:15:39.933744 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe30e726-ab37-4631-ac05-9d8bd9fdc11d" containerName="ceilometer-notification-agent" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.933753 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe30e726-ab37-4631-ac05-9d8bd9fdc11d" containerName="ceilometer-notification-agent" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.934037 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe30e726-ab37-4631-ac05-9d8bd9fdc11d" containerName="sg-core" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.934067 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe30e726-ab37-4631-ac05-9d8bd9fdc11d" containerName="proxy-httpd" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.934088 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe30e726-ab37-4631-ac05-9d8bd9fdc11d" containerName="ceilometer-central-agent" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.934110 4851 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fe30e726-ab37-4631-ac05-9d8bd9fdc11d" containerName="ceilometer-notification-agent" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.936470 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.939729 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.942720 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.946107 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 13:15:39 crc kubenswrapper[4851]: I1001 13:15:39.946721 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.049205 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.049261 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9kb6\" (UniqueName: \"kubernetes.io/projected/3db4192e-1ee2-4dca-98b2-65991b966ebe-kube-api-access-d9kb6\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.049321 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-config-data\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.049359 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.049385 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3db4192e-1ee2-4dca-98b2-65991b966ebe-run-httpd\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.049441 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3db4192e-1ee2-4dca-98b2-65991b966ebe-log-httpd\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.049603 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-scripts\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc 
kubenswrapper[4851]: I1001 13:15:40.049673 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.154616 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-config-data\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.154683 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.154725 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3db4192e-1ee2-4dca-98b2-65991b966ebe-run-httpd\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.154840 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3db4192e-1ee2-4dca-98b2-65991b966ebe-log-httpd\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.154936 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-scripts\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.155004 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.155485 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3db4192e-1ee2-4dca-98b2-65991b966ebe-run-httpd\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.156143 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.156204 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9kb6\" (UniqueName: \"kubernetes.io/projected/3db4192e-1ee2-4dca-98b2-65991b966ebe-kube-api-access-d9kb6\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 
13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.156338 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3db4192e-1ee2-4dca-98b2-65991b966ebe-log-httpd\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.158716 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-config-data\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.161078 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.161222 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.161668 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-scripts\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.163081 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.177184 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9kb6\" (UniqueName: \"kubernetes.io/projected/3db4192e-1ee2-4dca-98b2-65991b966ebe-kube-api-access-d9kb6\") pod \"ceilometer-0\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.287182 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.343847 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe30e726-ab37-4631-ac05-9d8bd9fdc11d" path="/var/lib/kubelet/pods/fe30e726-ab37-4631-ac05-9d8bd9fdc11d/volumes" Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.745894 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:15:40 crc kubenswrapper[4851]: I1001 13:15:40.868651 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3db4192e-1ee2-4dca-98b2-65991b966ebe","Type":"ContainerStarted","Data":"c2af9936437fbd909071572649c3145f939ff3c62dce4da04b7b3044cf3e7f52"} Oct 01 13:15:41 crc kubenswrapper[4851]: I1001 13:15:41.883332 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3db4192e-1ee2-4dca-98b2-65991b966ebe","Type":"ContainerStarted","Data":"7210d3ee7aee58e182043a786847899de398eb05a67afad31a7f7c0dfcfd21e0"} Oct 01 13:15:41 crc kubenswrapper[4851]: I1001 13:15:41.883762 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3db4192e-1ee2-4dca-98b2-65991b966ebe","Type":"ContainerStarted","Data":"d11cb0a0b8196f9e568c26db821234d4cbbbc013304e091b3f4129e96e63929f"} Oct 01 13:15:42 crc kubenswrapper[4851]: I1001 13:15:42.050748 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 01 13:15:42 crc kubenswrapper[4851]: I1001 13:15:42.894730 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3db4192e-1ee2-4dca-98b2-65991b966ebe","Type":"ContainerStarted","Data":"eef80d698e37bc95f05daa2017406b124eb066e4fa7658d0e6ea804b73f00b59"} Oct 01 13:15:43 crc kubenswrapper[4851]: I1001 13:15:43.042193 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 13:15:43 crc kubenswrapper[4851]: I1001 13:15:43.042626 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 13:15:43 crc kubenswrapper[4851]: I1001 13:15:43.402656 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 01 13:15:43 crc kubenswrapper[4851]: I1001 13:15:43.437109 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 01 13:15:43 crc kubenswrapper[4851]: I1001 13:15:43.951047 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 01 13:15:44 crc kubenswrapper[4851]: I1001 13:15:44.125749 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="97cbbb39-fa41-45f3-9d97-a9509e7983ae" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 13:15:44 crc kubenswrapper[4851]: I1001 13:15:44.126100 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="97cbbb39-fa41-45f3-9d97-a9509e7983ae" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 13:15:44 crc kubenswrapper[4851]: I1001 13:15:44.919334 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3db4192e-1ee2-4dca-98b2-65991b966ebe","Type":"ContainerStarted","Data":"33be75013d251c492a7e9895df60369fddf8f399140bda47cb8159ceadd2de2a"} Oct 01 13:15:44 crc kubenswrapper[4851]: I1001 13:15:44.919740 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 13:15:44 crc kubenswrapper[4851]: I1001 13:15:44.950944 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.638166721 podStartE2EDuration="5.950916005s" podCreationTimestamp="2025-10-01 13:15:39 +0000 UTC" firstStartedPulling="2025-10-01 13:15:40.755677992 +0000 UTC m=+1349.100795478" lastFinishedPulling="2025-10-01 13:15:44.068427266 +0000 UTC m=+1352.413544762" observedRunningTime="2025-10-01 13:15:44.945846067 +0000 UTC m=+1353.290963553" watchObservedRunningTime="2025-10-01 13:15:44.950916005 +0000 UTC m=+1353.296033491" Oct 01 13:15:47 crc kubenswrapper[4851]: I1001 13:15:47.919873 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 13:15:47 crc kubenswrapper[4851]: I1001 13:15:47.929539 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 13:15:47 crc kubenswrapper[4851]: I1001 13:15:47.931808 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 13:15:47 crc kubenswrapper[4851]: I1001 13:15:47.978446 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 13:15:49 crc kubenswrapper[4851]: I1001 13:15:49.799521 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:49 crc kubenswrapper[4851]: I1001 13:15:49.971223 4851 generic.go:334] "Generic (PLEG): container finished" podID="16a3f1fc-2d6b-4d3a-95c3-b36258749d53" containerID="0bca02248dc1c251e8985412a6a8c29bc91aaed972b3ba3d5795a5fa1e8b1be5" exitCode=137 Oct 01 13:15:49 crc kubenswrapper[4851]: I1001 13:15:49.971267 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:49 crc kubenswrapper[4851]: I1001 13:15:49.971334 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"16a3f1fc-2d6b-4d3a-95c3-b36258749d53","Type":"ContainerDied","Data":"0bca02248dc1c251e8985412a6a8c29bc91aaed972b3ba3d5795a5fa1e8b1be5"} Oct 01 13:15:49 crc kubenswrapper[4851]: I1001 13:15:49.971382 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"16a3f1fc-2d6b-4d3a-95c3-b36258749d53","Type":"ContainerDied","Data":"ef3825674fc0274179c024445cef7fe57746075815a0e6f5ec5ac603c63de0ee"} Oct 01 13:15:49 crc kubenswrapper[4851]: I1001 13:15:49.971405 4851 scope.go:117] "RemoveContainer" containerID="0bca02248dc1c251e8985412a6a8c29bc91aaed972b3ba3d5795a5fa1e8b1be5" Oct 01 13:15:49 crc kubenswrapper[4851]: I1001 13:15:49.973077 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a3f1fc-2d6b-4d3a-95c3-b36258749d53-combined-ca-bundle\") pod \"16a3f1fc-2d6b-4d3a-95c3-b36258749d53\" (UID: \"16a3f1fc-2d6b-4d3a-95c3-b36258749d53\") " Oct 01 13:15:49 crc kubenswrapper[4851]: I1001 13:15:49.973189 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr2bv\" (UniqueName: \"kubernetes.io/projected/16a3f1fc-2d6b-4d3a-95c3-b36258749d53-kube-api-access-nr2bv\") pod \"16a3f1fc-2d6b-4d3a-95c3-b36258749d53\" (UID: \"16a3f1fc-2d6b-4d3a-95c3-b36258749d53\") " Oct 01 13:15:49 crc kubenswrapper[4851]: I1001 13:15:49.973384 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a3f1fc-2d6b-4d3a-95c3-b36258749d53-config-data\") pod \"16a3f1fc-2d6b-4d3a-95c3-b36258749d53\" (UID: \"16a3f1fc-2d6b-4d3a-95c3-b36258749d53\") " Oct 01 13:15:49 crc kubenswrapper[4851]: I1001 13:15:49.978954 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16a3f1fc-2d6b-4d3a-95c3-b36258749d53-kube-api-access-nr2bv" (OuterVolumeSpecName: "kube-api-access-nr2bv") pod "16a3f1fc-2d6b-4d3a-95c3-b36258749d53" (UID: "16a3f1fc-2d6b-4d3a-95c3-b36258749d53"). InnerVolumeSpecName "kube-api-access-nr2bv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.007124 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a3f1fc-2d6b-4d3a-95c3-b36258749d53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16a3f1fc-2d6b-4d3a-95c3-b36258749d53" (UID: "16a3f1fc-2d6b-4d3a-95c3-b36258749d53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.017324 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a3f1fc-2d6b-4d3a-95c3-b36258749d53-config-data" (OuterVolumeSpecName: "config-data") pod "16a3f1fc-2d6b-4d3a-95c3-b36258749d53" (UID: "16a3f1fc-2d6b-4d3a-95c3-b36258749d53"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.076414 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a3f1fc-2d6b-4d3a-95c3-b36258749d53-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.076459 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a3f1fc-2d6b-4d3a-95c3-b36258749d53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.076480 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nr2bv\" (UniqueName: \"kubernetes.io/projected/16a3f1fc-2d6b-4d3a-95c3-b36258749d53-kube-api-access-nr2bv\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.145208 4851 scope.go:117] "RemoveContainer" containerID="0bca02248dc1c251e8985412a6a8c29bc91aaed972b3ba3d5795a5fa1e8b1be5" Oct 01 13:15:50 crc kubenswrapper[4851]: E1001 13:15:50.145904 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bca02248dc1c251e8985412a6a8c29bc91aaed972b3ba3d5795a5fa1e8b1be5\": container with ID starting with 0bca02248dc1c251e8985412a6a8c29bc91aaed972b3ba3d5795a5fa1e8b1be5 not found: ID does not exist" containerID="0bca02248dc1c251e8985412a6a8c29bc91aaed972b3ba3d5795a5fa1e8b1be5" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.145960 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bca02248dc1c251e8985412a6a8c29bc91aaed972b3ba3d5795a5fa1e8b1be5"} err="failed to get container status \"0bca02248dc1c251e8985412a6a8c29bc91aaed972b3ba3d5795a5fa1e8b1be5\": rpc error: code = NotFound desc = could not find container \"0bca02248dc1c251e8985412a6a8c29bc91aaed972b3ba3d5795a5fa1e8b1be5\": container with ID starting with 0bca02248dc1c251e8985412a6a8c29bc91aaed972b3ba3d5795a5fa1e8b1be5 not found: ID does not exist" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.313916 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.326424 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.347901 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16a3f1fc-2d6b-4d3a-95c3-b36258749d53" path="/var/lib/kubelet/pods/16a3f1fc-2d6b-4d3a-95c3-b36258749d53/volumes" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.351450 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 13:15:50 crc kubenswrapper[4851]: E1001 13:15:50.352343 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a3f1fc-2d6b-4d3a-95c3-b36258749d53" containerName="nova-cell1-novncproxy-novncproxy" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.352392 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a3f1fc-2d6b-4d3a-95c3-b36258749d53" containerName="nova-cell1-novncproxy-novncproxy" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.352947 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="16a3f1fc-2d6b-4d3a-95c3-b36258749d53" containerName="nova-cell1-novncproxy-novncproxy" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.354495 4851 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.359559 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.359652 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.359930 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.364288 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.487790 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ab5594b-8bc0-4c89-b570-604b54931bbe-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ab5594b-8bc0-4c89-b570-604b54931bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.487871 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab5594b-8bc0-4c89-b570-604b54931bbe-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ab5594b-8bc0-4c89-b570-604b54931bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.488042 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ab5594b-8bc0-4c89-b570-604b54931bbe-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ab5594b-8bc0-4c89-b570-604b54931bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.488088 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsjtt\" (UniqueName: \"kubernetes.io/projected/1ab5594b-8bc0-4c89-b570-604b54931bbe-kube-api-access-nsjtt\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ab5594b-8bc0-4c89-b570-604b54931bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.488112 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ab5594b-8bc0-4c89-b570-604b54931bbe-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ab5594b-8bc0-4c89-b570-604b54931bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.590769 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ab5594b-8bc0-4c89-b570-604b54931bbe-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ab5594b-8bc0-4c89-b570-604b54931bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.590847 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab5594b-8bc0-4c89-b570-604b54931bbe-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"1ab5594b-8bc0-4c89-b570-604b54931bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.590960 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ab5594b-8bc0-4c89-b570-604b54931bbe-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ab5594b-8bc0-4c89-b570-604b54931bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.591006 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsjtt\" (UniqueName: \"kubernetes.io/projected/1ab5594b-8bc0-4c89-b570-604b54931bbe-kube-api-access-nsjtt\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ab5594b-8bc0-4c89-b570-604b54931bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.591031 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ab5594b-8bc0-4c89-b570-604b54931bbe-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ab5594b-8bc0-4c89-b570-604b54931bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.600340 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ab5594b-8bc0-4c89-b570-604b54931bbe-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ab5594b-8bc0-4c89-b570-604b54931bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.601120 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ab5594b-8bc0-4c89-b570-604b54931bbe-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ab5594b-8bc0-4c89-b570-604b54931bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.601648 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab5594b-8bc0-4c89-b570-604b54931bbe-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ab5594b-8bc0-4c89-b570-604b54931bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.601818 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ab5594b-8bc0-4c89-b570-604b54931bbe-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ab5594b-8bc0-4c89-b570-604b54931bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.609038 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsjtt\" (UniqueName: \"kubernetes.io/projected/1ab5594b-8bc0-4c89-b570-604b54931bbe-kube-api-access-nsjtt\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ab5594b-8bc0-4c89-b570-604b54931bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:50 crc kubenswrapper[4851]: I1001 13:15:50.696670 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:51 crc kubenswrapper[4851]: I1001 13:15:51.263521 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 13:15:52 crc kubenswrapper[4851]: I1001 13:15:52.008199 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1ab5594b-8bc0-4c89-b570-604b54931bbe","Type":"ContainerStarted","Data":"682d23b865030250b662e74ba6b100b907ebbe722aa251aaf815a988cf051bb7"} Oct 01 13:15:52 crc kubenswrapper[4851]: I1001 13:15:52.008852 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1ab5594b-8bc0-4c89-b570-604b54931bbe","Type":"ContainerStarted","Data":"2bda9decce42863f74843afeadac071bf910978c50d16444d81445941a34a552"} Oct 01 13:15:52 crc kubenswrapper[4851]: I1001 13:15:52.037398 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.037367378 podStartE2EDuration="2.037367378s" podCreationTimestamp="2025-10-01 13:15:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:15:52.029112093 +0000 UTC m=+1360.374229619" watchObservedRunningTime="2025-10-01 13:15:52.037367378 +0000 UTC m=+1360.382484904" Oct 01 13:15:53 crc kubenswrapper[4851]: I1001 13:15:53.049817 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 13:15:53 crc kubenswrapper[4851]: I1001 13:15:53.050462 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 13:15:53 crc kubenswrapper[4851]: I1001 13:15:53.050547 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 13:15:53 crc kubenswrapper[4851]: I1001 13:15:53.059698 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.033575 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.040597 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.230456 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74745b66cf-sh8zk"] Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.232099 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.250078 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74745b66cf-sh8zk"] Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.374939 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-config\") pod \"dnsmasq-dns-74745b66cf-sh8zk\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.375001 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dcx8\" (UniqueName: \"kubernetes.io/projected/10f182d6-67d2-4878-afcc-3e38d1f689aa-kube-api-access-7dcx8\") pod \"dnsmasq-dns-74745b66cf-sh8zk\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.375044 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-dns-svc\") pod \"dnsmasq-dns-74745b66cf-sh8zk\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.375078 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-dns-swift-storage-0\") pod \"dnsmasq-dns-74745b66cf-sh8zk\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.375219 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-ovsdbserver-nb\") pod \"dnsmasq-dns-74745b66cf-sh8zk\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.375593 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-ovsdbserver-sb\") pod \"dnsmasq-dns-74745b66cf-sh8zk\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.477462 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-ovsdbserver-sb\") pod \"dnsmasq-dns-74745b66cf-sh8zk\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.477629 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-config\") pod \"dnsmasq-dns-74745b66cf-sh8zk\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.477698 4851 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7dcx8\" (UniqueName: \"kubernetes.io/projected/10f182d6-67d2-4878-afcc-3e38d1f689aa-kube-api-access-7dcx8\") pod \"dnsmasq-dns-74745b66cf-sh8zk\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.477752 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-dns-svc\") pod \"dnsmasq-dns-74745b66cf-sh8zk\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.477806 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-dns-swift-storage-0\") pod \"dnsmasq-dns-74745b66cf-sh8zk\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.477865 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-ovsdbserver-nb\") pod \"dnsmasq-dns-74745b66cf-sh8zk\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.478741 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-config\") pod \"dnsmasq-dns-74745b66cf-sh8zk\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.478912 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-dns-swift-storage-0\") pod \"dnsmasq-dns-74745b66cf-sh8zk\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.478923 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-dns-svc\") pod \"dnsmasq-dns-74745b66cf-sh8zk\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.479176 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-ovsdbserver-nb\") pod \"dnsmasq-dns-74745b66cf-sh8zk\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.479589 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-ovsdbserver-sb\") pod \"dnsmasq-dns-74745b66cf-sh8zk\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.504164 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dcx8\" (UniqueName: 
\"kubernetes.io/projected/10f182d6-67d2-4878-afcc-3e38d1f689aa-kube-api-access-7dcx8\") pod \"dnsmasq-dns-74745b66cf-sh8zk\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:15:54 crc kubenswrapper[4851]: I1001 13:15:54.573010 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:15:55 crc kubenswrapper[4851]: I1001 13:15:55.142968 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74745b66cf-sh8zk"] Oct 01 13:15:55 crc kubenswrapper[4851]: I1001 13:15:55.696982 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:15:56 crc kubenswrapper[4851]: I1001 13:15:56.056927 4851 generic.go:334] "Generic (PLEG): container finished" podID="10f182d6-67d2-4878-afcc-3e38d1f689aa" containerID="42c86437fdfbd9a24222fa7007a183b2f7221b67a0de62b064dceca6b3020985" exitCode=0 Oct 01 13:15:56 crc kubenswrapper[4851]: I1001 13:15:56.057047 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" event={"ID":"10f182d6-67d2-4878-afcc-3e38d1f689aa","Type":"ContainerDied","Data":"42c86437fdfbd9a24222fa7007a183b2f7221b67a0de62b064dceca6b3020985"} Oct 01 13:15:56 crc kubenswrapper[4851]: I1001 13:15:56.057380 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" event={"ID":"10f182d6-67d2-4878-afcc-3e38d1f689aa","Type":"ContainerStarted","Data":"3a263e4100487e5bd2f83cb2095497991de5bc438178e14c967f60d592303115"} Oct 01 13:15:56 crc kubenswrapper[4851]: I1001 13:15:56.667478 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:15:56 crc kubenswrapper[4851]: I1001 13:15:56.853081 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:15:56 crc kubenswrapper[4851]: I1001 13:15:56.854586 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3db4192e-1ee2-4dca-98b2-65991b966ebe" containerName="ceilometer-central-agent" containerID="cri-o://d11cb0a0b8196f9e568c26db821234d4cbbbc013304e091b3f4129e96e63929f" gracePeriod=30 Oct 01 13:15:56 crc kubenswrapper[4851]: I1001 13:15:56.854860 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3db4192e-1ee2-4dca-98b2-65991b966ebe" containerName="proxy-httpd" containerID="cri-o://33be75013d251c492a7e9895df60369fddf8f399140bda47cb8159ceadd2de2a" gracePeriod=30 Oct 01 13:15:56 crc kubenswrapper[4851]: I1001 13:15:56.855005 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3db4192e-1ee2-4dca-98b2-65991b966ebe" containerName="sg-core" containerID="cri-o://eef80d698e37bc95f05daa2017406b124eb066e4fa7658d0e6ea804b73f00b59" gracePeriod=30 Oct 01 13:15:56 crc kubenswrapper[4851]: I1001 13:15:56.855160 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3db4192e-1ee2-4dca-98b2-65991b966ebe" containerName="ceilometer-notification-agent" containerID="cri-o://7210d3ee7aee58e182043a786847899de398eb05a67afad31a7f7c0dfcfd21e0" gracePeriod=30 Oct 01 13:15:56 crc kubenswrapper[4851]: I1001 13:15:56.883231 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="3db4192e-1ee2-4dca-98b2-65991b966ebe" containerName="proxy-httpd" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 01 13:15:57 crc kubenswrapper[4851]: I1001 13:15:57.068733 4851 generic.go:334] "Generic (PLEG): container finished" podID="3db4192e-1ee2-4dca-98b2-65991b966ebe" containerID="eef80d698e37bc95f05daa2017406b124eb066e4fa7658d0e6ea804b73f00b59" exitCode=2 Oct 01 13:15:57 crc kubenswrapper[4851]: I1001 13:15:57.068831 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3db4192e-1ee2-4dca-98b2-65991b966ebe","Type":"ContainerDied","Data":"eef80d698e37bc95f05daa2017406b124eb066e4fa7658d0e6ea804b73f00b59"} Oct 01 13:15:57 crc kubenswrapper[4851]: I1001 13:15:57.070461 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" event={"ID":"10f182d6-67d2-4878-afcc-3e38d1f689aa","Type":"ContainerStarted","Data":"311962d94f7a32ac08ddb66e50fd2304671fec1892f4c2ca64d9941e19d957df"} Oct 01 13:15:57 crc kubenswrapper[4851]: I1001 13:15:57.070603 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="97cbbb39-fa41-45f3-9d97-a9509e7983ae" containerName="nova-api-log" containerID="cri-o://304778a0f457a1dd5fe8201652857cd01f42b454123e4963eed28bf770620bbc" gracePeriod=30 Oct 01 13:15:57 crc kubenswrapper[4851]: I1001 13:15:57.070738 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="97cbbb39-fa41-45f3-9d97-a9509e7983ae" containerName="nova-api-api" containerID="cri-o://675bffdac6025adc8c329864bb268c6ce7ff9509988afd05e4e88b5e450e77f6" gracePeriod=30 Oct 01 13:15:57 crc kubenswrapper[4851]: I1001 13:15:57.104243 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" podStartSLOduration=3.104222962 podStartE2EDuration="3.104222962s" podCreationTimestamp="2025-10-01 13:15:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:15:57.09645925 +0000 UTC m=+1365.441576736" watchObservedRunningTime="2025-10-01 13:15:57.104222962 +0000 UTC m=+1365.449340448" Oct 01 13:15:58 crc kubenswrapper[4851]: I1001 13:15:58.082325 4851 generic.go:334] "Generic (PLEG): container finished" podID="97cbbb39-fa41-45f3-9d97-a9509e7983ae" containerID="304778a0f457a1dd5fe8201652857cd01f42b454123e4963eed28bf770620bbc" exitCode=143 Oct 01 13:15:58 crc kubenswrapper[4851]: I1001 13:15:58.082407 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97cbbb39-fa41-45f3-9d97-a9509e7983ae","Type":"ContainerDied","Data":"304778a0f457a1dd5fe8201652857cd01f42b454123e4963eed28bf770620bbc"} Oct 01 13:15:58 crc kubenswrapper[4851]: I1001 13:15:58.085282 4851 generic.go:334] "Generic (PLEG): container finished" podID="3db4192e-1ee2-4dca-98b2-65991b966ebe" containerID="33be75013d251c492a7e9895df60369fddf8f399140bda47cb8159ceadd2de2a" exitCode=0 Oct 01 13:15:58 crc kubenswrapper[4851]: I1001 13:15:58.085309 4851 generic.go:334] "Generic (PLEG): container finished" podID="3db4192e-1ee2-4dca-98b2-65991b966ebe" containerID="d11cb0a0b8196f9e568c26db821234d4cbbbc013304e091b3f4129e96e63929f" exitCode=0 Oct 01 13:15:58 crc kubenswrapper[4851]: I1001 13:15:58.085342 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3db4192e-1ee2-4dca-98b2-65991b966ebe","Type":"ContainerDied","Data":"33be75013d251c492a7e9895df60369fddf8f399140bda47cb8159ceadd2de2a"} Oct 
01 13:15:58 crc kubenswrapper[4851]: I1001 13:15:58.085375 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3db4192e-1ee2-4dca-98b2-65991b966ebe","Type":"ContainerDied","Data":"d11cb0a0b8196f9e568c26db821234d4cbbbc013304e091b3f4129e96e63929f"} Oct 01 13:15:58 crc kubenswrapper[4851]: I1001 13:15:58.085474 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:15:59 crc kubenswrapper[4851]: I1001 13:15:59.095659 4851 generic.go:334] "Generic (PLEG): container finished" podID="97cbbb39-fa41-45f3-9d97-a9509e7983ae" containerID="675bffdac6025adc8c329864bb268c6ce7ff9509988afd05e4e88b5e450e77f6" exitCode=0 Oct 01 13:15:59 crc kubenswrapper[4851]: I1001 13:15:59.095705 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97cbbb39-fa41-45f3-9d97-a9509e7983ae","Type":"ContainerDied","Data":"675bffdac6025adc8c329864bb268c6ce7ff9509988afd05e4e88b5e450e77f6"} Oct 01 13:15:59 crc kubenswrapper[4851]: I1001 13:15:59.220176 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 13:15:59 crc kubenswrapper[4851]: I1001 13:15:59.391284 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97cbbb39-fa41-45f3-9d97-a9509e7983ae-logs\") pod \"97cbbb39-fa41-45f3-9d97-a9509e7983ae\" (UID: \"97cbbb39-fa41-45f3-9d97-a9509e7983ae\") " Oct 01 13:15:59 crc kubenswrapper[4851]: I1001 13:15:59.391429 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhqsv\" (UniqueName: \"kubernetes.io/projected/97cbbb39-fa41-45f3-9d97-a9509e7983ae-kube-api-access-dhqsv\") pod \"97cbbb39-fa41-45f3-9d97-a9509e7983ae\" (UID: \"97cbbb39-fa41-45f3-9d97-a9509e7983ae\") " Oct 01 13:15:59 crc kubenswrapper[4851]: I1001 13:15:59.391544 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97cbbb39-fa41-45f3-9d97-a9509e7983ae-combined-ca-bundle\") pod \"97cbbb39-fa41-45f3-9d97-a9509e7983ae\" (UID: \"97cbbb39-fa41-45f3-9d97-a9509e7983ae\") " Oct 01 13:15:59 crc kubenswrapper[4851]: I1001 13:15:59.391698 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97cbbb39-fa41-45f3-9d97-a9509e7983ae-config-data\") pod \"97cbbb39-fa41-45f3-9d97-a9509e7983ae\" (UID: \"97cbbb39-fa41-45f3-9d97-a9509e7983ae\") " Oct 01 13:15:59 crc kubenswrapper[4851]: I1001 13:15:59.391991 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97cbbb39-fa41-45f3-9d97-a9509e7983ae-logs" (OuterVolumeSpecName: "logs") pod "97cbbb39-fa41-45f3-9d97-a9509e7983ae" (UID: "97cbbb39-fa41-45f3-9d97-a9509e7983ae"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:15:59 crc kubenswrapper[4851]: I1001 13:15:59.392249 4851 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97cbbb39-fa41-45f3-9d97-a9509e7983ae-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:59 crc kubenswrapper[4851]: I1001 13:15:59.402632 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97cbbb39-fa41-45f3-9d97-a9509e7983ae-kube-api-access-dhqsv" (OuterVolumeSpecName: "kube-api-access-dhqsv") pod "97cbbb39-fa41-45f3-9d97-a9509e7983ae" (UID: "97cbbb39-fa41-45f3-9d97-a9509e7983ae"). InnerVolumeSpecName "kube-api-access-dhqsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:15:59 crc kubenswrapper[4851]: I1001 13:15:59.431329 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97cbbb39-fa41-45f3-9d97-a9509e7983ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97cbbb39-fa41-45f3-9d97-a9509e7983ae" (UID: "97cbbb39-fa41-45f3-9d97-a9509e7983ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:59 crc kubenswrapper[4851]: I1001 13:15:59.454947 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97cbbb39-fa41-45f3-9d97-a9509e7983ae-config-data" (OuterVolumeSpecName: "config-data") pod "97cbbb39-fa41-45f3-9d97-a9509e7983ae" (UID: "97cbbb39-fa41-45f3-9d97-a9509e7983ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:59 crc kubenswrapper[4851]: I1001 13:15:59.494944 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97cbbb39-fa41-45f3-9d97-a9509e7983ae-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:59 crc kubenswrapper[4851]: I1001 13:15:59.494969 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhqsv\" (UniqueName: \"kubernetes.io/projected/97cbbb39-fa41-45f3-9d97-a9509e7983ae-kube-api-access-dhqsv\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:59 crc kubenswrapper[4851]: I1001 13:15:59.494979 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97cbbb39-fa41-45f3-9d97-a9509e7983ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.049723 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.049788 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.118920 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97cbbb39-fa41-45f3-9d97-a9509e7983ae","Type":"ContainerDied","Data":"64d10fb013c28a3f9d0dc8010dd05c2305843d8399153b81a1581b518a2d4ae4"} Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 
13:16:00.119020 4851 scope.go:117] "RemoveContainer" containerID="675bffdac6025adc8c329864bb268c6ce7ff9509988afd05e4e88b5e450e77f6" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.119054 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.159779 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.175246 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.191115 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 13:16:00 crc kubenswrapper[4851]: E1001 13:16:00.191718 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97cbbb39-fa41-45f3-9d97-a9509e7983ae" containerName="nova-api-log" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.191794 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="97cbbb39-fa41-45f3-9d97-a9509e7983ae" containerName="nova-api-log" Oct 01 13:16:00 crc kubenswrapper[4851]: E1001 13:16:00.191888 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97cbbb39-fa41-45f3-9d97-a9509e7983ae" containerName="nova-api-api" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.191945 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="97cbbb39-fa41-45f3-9d97-a9509e7983ae" containerName="nova-api-api" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.192219 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="97cbbb39-fa41-45f3-9d97-a9509e7983ae" containerName="nova-api-api" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.192313 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="97cbbb39-fa41-45f3-9d97-a9509e7983ae" containerName="nova-api-log" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.195208 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.199225 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.199489 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.199728 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.200620 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.223201 4851 scope.go:117] "RemoveContainer" containerID="304778a0f457a1dd5fe8201652857cd01f42b454123e4963eed28bf770620bbc" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.310547 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " pod="openstack/nova-api-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.310606 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " pod="openstack/nova-api-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.310641 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-config-data\") pod \"nova-api-0\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " pod="openstack/nova-api-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.310981 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-public-tls-certs\") pod \"nova-api-0\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " pod="openstack/nova-api-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.311228 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgvpt\" (UniqueName: \"kubernetes.io/projected/d967224f-1bb3-4493-b3da-75fecbe641f8-kube-api-access-cgvpt\") pod \"nova-api-0\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " pod="openstack/nova-api-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.311262 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d967224f-1bb3-4493-b3da-75fecbe641f8-logs\") pod \"nova-api-0\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " pod="openstack/nova-api-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.346190 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97cbbb39-fa41-45f3-9d97-a9509e7983ae" path="/var/lib/kubelet/pods/97cbbb39-fa41-45f3-9d97-a9509e7983ae/volumes" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.413751 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgvpt\" (UniqueName: 
\"kubernetes.io/projected/d967224f-1bb3-4493-b3da-75fecbe641f8-kube-api-access-cgvpt\") pod \"nova-api-0\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " pod="openstack/nova-api-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.413845 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d967224f-1bb3-4493-b3da-75fecbe641f8-logs\") pod \"nova-api-0\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " pod="openstack/nova-api-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.413941 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " pod="openstack/nova-api-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.414033 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " pod="openstack/nova-api-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.414110 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-config-data\") pod \"nova-api-0\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " pod="openstack/nova-api-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.414380 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-public-tls-certs\") pod \"nova-api-0\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " pod="openstack/nova-api-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.415399 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d967224f-1bb3-4493-b3da-75fecbe641f8-logs\") pod \"nova-api-0\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " pod="openstack/nova-api-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.419139 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " pod="openstack/nova-api-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.419875 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-config-data\") pod \"nova-api-0\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " pod="openstack/nova-api-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.421113 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " pod="openstack/nova-api-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.422595 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-public-tls-certs\") pod 
\"nova-api-0\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " pod="openstack/nova-api-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.439415 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgvpt\" (UniqueName: \"kubernetes.io/projected/d967224f-1bb3-4493-b3da-75fecbe641f8-kube-api-access-cgvpt\") pod \"nova-api-0\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " pod="openstack/nova-api-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.518414 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.700870 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.719296 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:16:00 crc kubenswrapper[4851]: I1001 13:16:00.969980 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:16:00 crc kubenswrapper[4851]: W1001 13:16:00.981757 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd967224f_1bb3_4493_b3da_75fecbe641f8.slice/crio-eaed1e96ac968154cccfbd9eaefc17239b205532eedeff3687ee5177ae675aec WatchSource:0}: Error finding container eaed1e96ac968154cccfbd9eaefc17239b205532eedeff3687ee5177ae675aec: Status 404 returned error can't find the container with id eaed1e96ac968154cccfbd9eaefc17239b205532eedeff3687ee5177ae675aec Oct 01 13:16:01 crc kubenswrapper[4851]: I1001 13:16:01.139757 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d967224f-1bb3-4493-b3da-75fecbe641f8","Type":"ContainerStarted","Data":"eaed1e96ac968154cccfbd9eaefc17239b205532eedeff3687ee5177ae675aec"} Oct 01 13:16:01 crc kubenswrapper[4851]: I1001 13:16:01.159111 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:16:01 crc kubenswrapper[4851]: I1001 13:16:01.352754 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-krnh6"] Oct 01 13:16:01 crc kubenswrapper[4851]: I1001 13:16:01.366318 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-krnh6" Oct 01 13:16:01 crc kubenswrapper[4851]: I1001 13:16:01.369308 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 01 13:16:01 crc kubenswrapper[4851]: I1001 13:16:01.370663 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 01 13:16:01 crc kubenswrapper[4851]: I1001 13:16:01.371726 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-krnh6"] Oct 01 13:16:01 crc kubenswrapper[4851]: I1001 13:16:01.467269 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg96l\" (UniqueName: \"kubernetes.io/projected/4e7c9248-7155-4f8b-a070-3374a93bc56b-kube-api-access-gg96l\") pod \"nova-cell1-cell-mapping-krnh6\" (UID: \"4e7c9248-7155-4f8b-a070-3374a93bc56b\") " pod="openstack/nova-cell1-cell-mapping-krnh6" Oct 01 13:16:01 crc kubenswrapper[4851]: I1001 13:16:01.467467 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e7c9248-7155-4f8b-a070-3374a93bc56b-config-data\") pod \"nova-cell1-cell-mapping-krnh6\" (UID: \"4e7c9248-7155-4f8b-a070-3374a93bc56b\") " pod="openstack/nova-cell1-cell-mapping-krnh6" Oct 01 13:16:01 crc kubenswrapper[4851]: I1001 13:16:01.467518 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e7c9248-7155-4f8b-a070-3374a93bc56b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-krnh6\" (UID: \"4e7c9248-7155-4f8b-a070-3374a93bc56b\") " pod="openstack/nova-cell1-cell-mapping-krnh6" Oct 01 13:16:01 crc kubenswrapper[4851]: I1001 13:16:01.467590 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e7c9248-7155-4f8b-a070-3374a93bc56b-scripts\") pod \"nova-cell1-cell-mapping-krnh6\" (UID: \"4e7c9248-7155-4f8b-a070-3374a93bc56b\") " pod="openstack/nova-cell1-cell-mapping-krnh6" Oct 01 13:16:01 crc kubenswrapper[4851]: I1001 13:16:01.579866 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e7c9248-7155-4f8b-a070-3374a93bc56b-config-data\") pod \"nova-cell1-cell-mapping-krnh6\" (UID: \"4e7c9248-7155-4f8b-a070-3374a93bc56b\") " pod="openstack/nova-cell1-cell-mapping-krnh6" Oct 01 13:16:01 crc kubenswrapper[4851]: I1001 13:16:01.579918 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e7c9248-7155-4f8b-a070-3374a93bc56b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-krnh6\" (UID: \"4e7c9248-7155-4f8b-a070-3374a93bc56b\") " pod="openstack/nova-cell1-cell-mapping-krnh6" Oct 01 13:16:01 crc kubenswrapper[4851]: I1001 13:16:01.579989 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e7c9248-7155-4f8b-a070-3374a93bc56b-scripts\") pod \"nova-cell1-cell-mapping-krnh6\" (UID: \"4e7c9248-7155-4f8b-a070-3374a93bc56b\") " pod="openstack/nova-cell1-cell-mapping-krnh6" Oct 01 13:16:01 crc kubenswrapper[4851]: I1001 13:16:01.580030 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg96l\" (UniqueName: 
\"kubernetes.io/projected/4e7c9248-7155-4f8b-a070-3374a93bc56b-kube-api-access-gg96l\") pod \"nova-cell1-cell-mapping-krnh6\" (UID: \"4e7c9248-7155-4f8b-a070-3374a93bc56b\") " pod="openstack/nova-cell1-cell-mapping-krnh6" Oct 01 13:16:01 crc kubenswrapper[4851]: I1001 13:16:01.584021 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e7c9248-7155-4f8b-a070-3374a93bc56b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-krnh6\" (UID: \"4e7c9248-7155-4f8b-a070-3374a93bc56b\") " pod="openstack/nova-cell1-cell-mapping-krnh6" Oct 01 13:16:01 crc kubenswrapper[4851]: I1001 13:16:01.584124 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e7c9248-7155-4f8b-a070-3374a93bc56b-config-data\") pod \"nova-cell1-cell-mapping-krnh6\" (UID: \"4e7c9248-7155-4f8b-a070-3374a93bc56b\") " pod="openstack/nova-cell1-cell-mapping-krnh6" Oct 01 13:16:01 crc kubenswrapper[4851]: I1001 13:16:01.585775 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e7c9248-7155-4f8b-a070-3374a93bc56b-scripts\") pod \"nova-cell1-cell-mapping-krnh6\" (UID: \"4e7c9248-7155-4f8b-a070-3374a93bc56b\") " pod="openstack/nova-cell1-cell-mapping-krnh6" Oct 01 13:16:01 crc kubenswrapper[4851]: I1001 13:16:01.606395 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg96l\" (UniqueName: \"kubernetes.io/projected/4e7c9248-7155-4f8b-a070-3374a93bc56b-kube-api-access-gg96l\") pod \"nova-cell1-cell-mapping-krnh6\" (UID: \"4e7c9248-7155-4f8b-a070-3374a93bc56b\") " pod="openstack/nova-cell1-cell-mapping-krnh6" Oct 01 13:16:01 crc kubenswrapper[4851]: I1001 13:16:01.694774 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-krnh6" Oct 01 13:16:02 crc kubenswrapper[4851]: I1001 13:16:02.171456 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d967224f-1bb3-4493-b3da-75fecbe641f8","Type":"ContainerStarted","Data":"c5f3dfc42f20b40da1e581e4382d8106b1cdc183d51ce1411afbb0278d1ee60d"} Oct 01 13:16:02 crc kubenswrapper[4851]: I1001 13:16:02.171916 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d967224f-1bb3-4493-b3da-75fecbe641f8","Type":"ContainerStarted","Data":"e8f76d934c91c7e493ae3b5570bfc68d3ad4fe285237022e2db46b03651e181f"} Oct 01 13:16:02 crc kubenswrapper[4851]: I1001 13:16:02.177518 4851 generic.go:334] "Generic (PLEG): container finished" podID="3db4192e-1ee2-4dca-98b2-65991b966ebe" containerID="7210d3ee7aee58e182043a786847899de398eb05a67afad31a7f7c0dfcfd21e0" exitCode=0 Oct 01 13:16:02 crc kubenswrapper[4851]: I1001 13:16:02.177949 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3db4192e-1ee2-4dca-98b2-65991b966ebe","Type":"ContainerDied","Data":"7210d3ee7aee58e182043a786847899de398eb05a67afad31a7f7c0dfcfd21e0"} Oct 01 13:16:02 crc kubenswrapper[4851]: I1001 13:16:02.227047 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.227025637 podStartE2EDuration="2.227025637s" podCreationTimestamp="2025-10-01 13:16:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:16:02.192915509 +0000 UTC m=+1370.538033005" watchObservedRunningTime="2025-10-01 13:16:02.227025637 +0000 UTC m=+1370.572143123" Oct 01 13:16:02 crc kubenswrapper[4851]: I1001 13:16:02.228842 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-krnh6"] Oct 01 13:16:02 crc kubenswrapper[4851]: W1001 13:16:02.243698 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e7c9248_7155_4f8b_a070_3374a93bc56b.slice/crio-fc9c6a25576324d2d8a756746a9b9a1ae9de48adf5677ec238f15ecedafb0aad WatchSource:0}: Error finding container fc9c6a25576324d2d8a756746a9b9a1ae9de48adf5677ec238f15ecedafb0aad: Status 404 returned error can't find the container with id fc9c6a25576324d2d8a756746a9b9a1ae9de48adf5677ec238f15ecedafb0aad Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.052214 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.111409 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-combined-ca-bundle\") pod \"3db4192e-1ee2-4dca-98b2-65991b966ebe\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.111552 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3db4192e-1ee2-4dca-98b2-65991b966ebe-run-httpd\") pod \"3db4192e-1ee2-4dca-98b2-65991b966ebe\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.111598 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3db4192e-1ee2-4dca-98b2-65991b966ebe-log-httpd\") pod \"3db4192e-1ee2-4dca-98b2-65991b966ebe\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.111727 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-ceilometer-tls-certs\") pod \"3db4192e-1ee2-4dca-98b2-65991b966ebe\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.111840 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9kb6\" (UniqueName: \"kubernetes.io/projected/3db4192e-1ee2-4dca-98b2-65991b966ebe-kube-api-access-d9kb6\") pod \"3db4192e-1ee2-4dca-98b2-65991b966ebe\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.111899 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-scripts\") pod \"3db4192e-1ee2-4dca-98b2-65991b966ebe\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.111992 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-sg-core-conf-yaml\") pod \"3db4192e-1ee2-4dca-98b2-65991b966ebe\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.112088 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-config-data\") pod \"3db4192e-1ee2-4dca-98b2-65991b966ebe\" (UID: \"3db4192e-1ee2-4dca-98b2-65991b966ebe\") " Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.123732 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3db4192e-1ee2-4dca-98b2-65991b966ebe-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3db4192e-1ee2-4dca-98b2-65991b966ebe" (UID: "3db4192e-1ee2-4dca-98b2-65991b966ebe"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.123771 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3db4192e-1ee2-4dca-98b2-65991b966ebe-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3db4192e-1ee2-4dca-98b2-65991b966ebe" (UID: "3db4192e-1ee2-4dca-98b2-65991b966ebe"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.129671 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db4192e-1ee2-4dca-98b2-65991b966ebe-kube-api-access-d9kb6" (OuterVolumeSpecName: "kube-api-access-d9kb6") pod "3db4192e-1ee2-4dca-98b2-65991b966ebe" (UID: "3db4192e-1ee2-4dca-98b2-65991b966ebe"). InnerVolumeSpecName "kube-api-access-d9kb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.137678 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-scripts" (OuterVolumeSpecName: "scripts") pod "3db4192e-1ee2-4dca-98b2-65991b966ebe" (UID: "3db4192e-1ee2-4dca-98b2-65991b966ebe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.181238 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3db4192e-1ee2-4dca-98b2-65991b966ebe" (UID: "3db4192e-1ee2-4dca-98b2-65991b966ebe"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.192036 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-krnh6" event={"ID":"4e7c9248-7155-4f8b-a070-3374a93bc56b","Type":"ContainerStarted","Data":"48567905938e95dcebd6cf2bd21f5a663d779580c6cf0f0e7c1b928816e4a449"} Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.192098 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-krnh6" event={"ID":"4e7c9248-7155-4f8b-a070-3374a93bc56b","Type":"ContainerStarted","Data":"fc9c6a25576324d2d8a756746a9b9a1ae9de48adf5677ec238f15ecedafb0aad"} Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.209459 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3db4192e-1ee2-4dca-98b2-65991b966ebe","Type":"ContainerDied","Data":"c2af9936437fbd909071572649c3145f939ff3c62dce4da04b7b3044cf3e7f52"} Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.209464 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-krnh6" podStartSLOduration=2.209448284 podStartE2EDuration="2.209448284s" podCreationTimestamp="2025-10-01 13:16:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:16:03.205231309 +0000 UTC m=+1371.550348795" watchObservedRunningTime="2025-10-01 13:16:03.209448284 +0000 UTC m=+1371.554565770" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.209525 4851 scope.go:117] "RemoveContainer" containerID="33be75013d251c492a7e9895df60369fddf8f399140bda47cb8159ceadd2de2a" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.210295 4851 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.216821 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9kb6\" (UniqueName: \"kubernetes.io/projected/3db4192e-1ee2-4dca-98b2-65991b966ebe-kube-api-access-d9kb6\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.216952 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.217006 4851 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.217119 4851 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3db4192e-1ee2-4dca-98b2-65991b966ebe-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.217204 4851 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3db4192e-1ee2-4dca-98b2-65991b966ebe-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.218902 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3db4192e-1ee2-4dca-98b2-65991b966ebe" (UID: "3db4192e-1ee2-4dca-98b2-65991b966ebe"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.233212 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3db4192e-1ee2-4dca-98b2-65991b966ebe" (UID: "3db4192e-1ee2-4dca-98b2-65991b966ebe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.261323 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-config-data" (OuterVolumeSpecName: "config-data") pod "3db4192e-1ee2-4dca-98b2-65991b966ebe" (UID: "3db4192e-1ee2-4dca-98b2-65991b966ebe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.289594 4851 scope.go:117] "RemoveContainer" containerID="eef80d698e37bc95f05daa2017406b124eb066e4fa7658d0e6ea804b73f00b59" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.319660 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.319703 4851 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.319717 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db4192e-1ee2-4dca-98b2-65991b966ebe-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.323738 4851 scope.go:117] "RemoveContainer" containerID="7210d3ee7aee58e182043a786847899de398eb05a67afad31a7f7c0dfcfd21e0" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.350520 4851 scope.go:117] "RemoveContainer" containerID="d11cb0a0b8196f9e568c26db821234d4cbbbc013304e091b3f4129e96e63929f" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.554994 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.577346 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.588693 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:16:03 crc kubenswrapper[4851]: E1001 13:16:03.589371 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db4192e-1ee2-4dca-98b2-65991b966ebe" containerName="ceilometer-notification-agent" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.589399 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db4192e-1ee2-4dca-98b2-65991b966ebe" containerName="ceilometer-notification-agent" Oct 01 13:16:03 crc kubenswrapper[4851]: E1001 13:16:03.589425 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db4192e-1ee2-4dca-98b2-65991b966ebe" containerName="sg-core" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.589436 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db4192e-1ee2-4dca-98b2-65991b966ebe" containerName="sg-core" Oct 01 13:16:03 crc kubenswrapper[4851]: E1001 13:16:03.589467 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db4192e-1ee2-4dca-98b2-65991b966ebe" containerName="ceilometer-central-agent" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.589476 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db4192e-1ee2-4dca-98b2-65991b966ebe" containerName="ceilometer-central-agent" Oct 01 13:16:03 crc kubenswrapper[4851]: E1001 13:16:03.589518 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db4192e-1ee2-4dca-98b2-65991b966ebe" containerName="proxy-httpd" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.589531 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db4192e-1ee2-4dca-98b2-65991b966ebe" containerName="proxy-httpd" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.589787 4851 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="3db4192e-1ee2-4dca-98b2-65991b966ebe" containerName="ceilometer-notification-agent" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.589830 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db4192e-1ee2-4dca-98b2-65991b966ebe" containerName="ceilometer-central-agent" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.589842 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db4192e-1ee2-4dca-98b2-65991b966ebe" containerName="sg-core" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.589857 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db4192e-1ee2-4dca-98b2-65991b966ebe" containerName="proxy-httpd" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.592250 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.599980 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.599994 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.600532 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.631594 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.729858 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b04518a-1699-4ef1-8b54-57c7343e081c-config-data\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.730002 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b04518a-1699-4ef1-8b54-57c7343e081c-run-httpd\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.730063 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b04518a-1699-4ef1-8b54-57c7343e081c-log-httpd\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.730146 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b04518a-1699-4ef1-8b54-57c7343e081c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.730179 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b04518a-1699-4ef1-8b54-57c7343e081c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.730216 4851 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b04518a-1699-4ef1-8b54-57c7343e081c-scripts\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.730356 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5b04518a-1699-4ef1-8b54-57c7343e081c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.730395 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4n9k\" (UniqueName: \"kubernetes.io/projected/5b04518a-1699-4ef1-8b54-57c7343e081c-kube-api-access-q4n9k\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.831912 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b04518a-1699-4ef1-8b54-57c7343e081c-config-data\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.832002 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b04518a-1699-4ef1-8b54-57c7343e081c-run-httpd\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.832039 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b04518a-1699-4ef1-8b54-57c7343e081c-log-httpd\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.832087 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b04518a-1699-4ef1-8b54-57c7343e081c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.832108 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b04518a-1699-4ef1-8b54-57c7343e081c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.832133 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b04518a-1699-4ef1-8b54-57c7343e081c-scripts\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.832213 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5b04518a-1699-4ef1-8b54-57c7343e081c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0" Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 
Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.834568 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b04518a-1699-4ef1-8b54-57c7343e081c-run-httpd\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0"
Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.834710 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b04518a-1699-4ef1-8b54-57c7343e081c-log-httpd\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0"
Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.836289 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b04518a-1699-4ef1-8b54-57c7343e081c-scripts\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0"
Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.837422 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b04518a-1699-4ef1-8b54-57c7343e081c-config-data\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0"
Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.838322 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5b04518a-1699-4ef1-8b54-57c7343e081c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0"
Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.838784 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b04518a-1699-4ef1-8b54-57c7343e081c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0"
Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.839475 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b04518a-1699-4ef1-8b54-57c7343e081c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0"
Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.860732 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4n9k\" (UniqueName: \"kubernetes.io/projected/5b04518a-1699-4ef1-8b54-57c7343e081c-kube-api-access-q4n9k\") pod \"ceilometer-0\" (UID: \"5b04518a-1699-4ef1-8b54-57c7343e081c\") " pod="openstack/ceilometer-0"
Oct 01 13:16:03 crc kubenswrapper[4851]: I1001 13:16:03.914432 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 13:16:04 crc kubenswrapper[4851]: I1001 13:16:04.341455 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3db4192e-1ee2-4dca-98b2-65991b966ebe" path="/var/lib/kubelet/pods/3db4192e-1ee2-4dca-98b2-65991b966ebe/volumes"
Oct 01 13:16:04 crc kubenswrapper[4851]: I1001 13:16:04.395947 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 13:16:04 crc kubenswrapper[4851]: W1001 13:16:04.396892 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b04518a_1699_4ef1_8b54_57c7343e081c.slice/crio-7035f8abca9f434f815872d16a3b3e3cb3511a8d9b21b1e0ae11a87b9458ddd4 WatchSource:0}: Error finding container 7035f8abca9f434f815872d16a3b3e3cb3511a8d9b21b1e0ae11a87b9458ddd4: Status 404 returned error can't find the container with id 7035f8abca9f434f815872d16a3b3e3cb3511a8d9b21b1e0ae11a87b9458ddd4
Oct 01 13:16:04 crc kubenswrapper[4851]: I1001 13:16:04.574679 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74745b66cf-sh8zk"
Oct 01 13:16:04 crc kubenswrapper[4851]: I1001 13:16:04.658780 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c4f994dc9-lkgs9"]
Oct 01 13:16:04 crc kubenswrapper[4851]: I1001 13:16:04.659046 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" podUID="6db851a0-f990-44cf-8766-ef6891037e3a" containerName="dnsmasq-dns" containerID="cri-o://b037c68ef909e110d913e388097f52e92456ceaa1bfae19d7a32debb805a6fec" gracePeriod=10
Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.186152 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9"
Need to start a new one" pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.245595 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b04518a-1699-4ef1-8b54-57c7343e081c","Type":"ContainerStarted","Data":"c16241604d04295b02783093a64579e633ec0241d681cdd451eea945d103865a"} Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.245840 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b04518a-1699-4ef1-8b54-57c7343e081c","Type":"ContainerStarted","Data":"7035f8abca9f434f815872d16a3b3e3cb3511a8d9b21b1e0ae11a87b9458ddd4"} Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.248326 4851 generic.go:334] "Generic (PLEG): container finished" podID="6db851a0-f990-44cf-8766-ef6891037e3a" containerID="b037c68ef909e110d913e388097f52e92456ceaa1bfae19d7a32debb805a6fec" exitCode=0 Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.248366 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" event={"ID":"6db851a0-f990-44cf-8766-ef6891037e3a","Type":"ContainerDied","Data":"b037c68ef909e110d913e388097f52e92456ceaa1bfae19d7a32debb805a6fec"} Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.248390 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" event={"ID":"6db851a0-f990-44cf-8766-ef6891037e3a","Type":"ContainerDied","Data":"47e6ed57dbd8c6e3383c2752c6d06ecdc82460f3a1428a81add15357187e15fa"} Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.248407 4851 scope.go:117] "RemoveContainer" containerID="b037c68ef909e110d913e388097f52e92456ceaa1bfae19d7a32debb805a6fec" Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.248543 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.263823 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-config\") pod \"6db851a0-f990-44cf-8766-ef6891037e3a\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.263967 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-dns-swift-storage-0\") pod \"6db851a0-f990-44cf-8766-ef6891037e3a\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.264026 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-ovsdbserver-sb\") pod \"6db851a0-f990-44cf-8766-ef6891037e3a\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.264102 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbdp6\" (UniqueName: \"kubernetes.io/projected/6db851a0-f990-44cf-8766-ef6891037e3a-kube-api-access-mbdp6\") pod \"6db851a0-f990-44cf-8766-ef6891037e3a\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.264239 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-dns-svc\") pod \"6db851a0-f990-44cf-8766-ef6891037e3a\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.264285 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-ovsdbserver-nb\") pod \"6db851a0-f990-44cf-8766-ef6891037e3a\" (UID: \"6db851a0-f990-44cf-8766-ef6891037e3a\") " Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.275832 4851 scope.go:117] "RemoveContainer" containerID="e7094989432d343cdf41b68eb5a16f64934aae57c5926f1777802cf768a8932b" Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.278321 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6db851a0-f990-44cf-8766-ef6891037e3a-kube-api-access-mbdp6" (OuterVolumeSpecName: "kube-api-access-mbdp6") pod "6db851a0-f990-44cf-8766-ef6891037e3a" (UID: "6db851a0-f990-44cf-8766-ef6891037e3a"). InnerVolumeSpecName "kube-api-access-mbdp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.324768 4851 scope.go:117] "RemoveContainer" containerID="b037c68ef909e110d913e388097f52e92456ceaa1bfae19d7a32debb805a6fec" Oct 01 13:16:05 crc kubenswrapper[4851]: E1001 13:16:05.326749 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b037c68ef909e110d913e388097f52e92456ceaa1bfae19d7a32debb805a6fec\": container with ID starting with b037c68ef909e110d913e388097f52e92456ceaa1bfae19d7a32debb805a6fec not found: ID does not exist" containerID="b037c68ef909e110d913e388097f52e92456ceaa1bfae19d7a32debb805a6fec" Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.326788 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b037c68ef909e110d913e388097f52e92456ceaa1bfae19d7a32debb805a6fec"} err="failed to get container status \"b037c68ef909e110d913e388097f52e92456ceaa1bfae19d7a32debb805a6fec\": rpc error: code = NotFound desc = could not find container \"b037c68ef909e110d913e388097f52e92456ceaa1bfae19d7a32debb805a6fec\": container with ID starting with b037c68ef909e110d913e388097f52e92456ceaa1bfae19d7a32debb805a6fec not found: ID does not exist" Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.326810 4851 scope.go:117] "RemoveContainer" containerID="e7094989432d343cdf41b68eb5a16f64934aae57c5926f1777802cf768a8932b" Oct 01 13:16:05 crc kubenswrapper[4851]: E1001 13:16:05.327331 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7094989432d343cdf41b68eb5a16f64934aae57c5926f1777802cf768a8932b\": container with ID starting with e7094989432d343cdf41b68eb5a16f64934aae57c5926f1777802cf768a8932b not found: ID does not exist" containerID="e7094989432d343cdf41b68eb5a16f64934aae57c5926f1777802cf768a8932b" Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.327354 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7094989432d343cdf41b68eb5a16f64934aae57c5926f1777802cf768a8932b"} err="failed to get container status \"e7094989432d343cdf41b68eb5a16f64934aae57c5926f1777802cf768a8932b\": rpc error: code = NotFound desc = could not find container \"e7094989432d343cdf41b68eb5a16f64934aae57c5926f1777802cf768a8932b\": container with ID starting with e7094989432d343cdf41b68eb5a16f64934aae57c5926f1777802cf768a8932b not found: ID does not exist" Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.329102 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-config" (OuterVolumeSpecName: "config") pod "6db851a0-f990-44cf-8766-ef6891037e3a" (UID: "6db851a0-f990-44cf-8766-ef6891037e3a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.330654 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6db851a0-f990-44cf-8766-ef6891037e3a" (UID: "6db851a0-f990-44cf-8766-ef6891037e3a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.336792 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6db851a0-f990-44cf-8766-ef6891037e3a" (UID: "6db851a0-f990-44cf-8766-ef6891037e3a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.343336 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6db851a0-f990-44cf-8766-ef6891037e3a" (UID: "6db851a0-f990-44cf-8766-ef6891037e3a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.362638 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6db851a0-f990-44cf-8766-ef6891037e3a" (UID: "6db851a0-f990-44cf-8766-ef6891037e3a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.366422 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.366454 4851 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.366465 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.366474 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbdp6\" (UniqueName: \"kubernetes.io/projected/6db851a0-f990-44cf-8766-ef6891037e3a-kube-api-access-mbdp6\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.366482 4851 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.366490 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6db851a0-f990-44cf-8766-ef6891037e3a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.581866 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c4f994dc9-lkgs9"] Oct 01 13:16:05 crc kubenswrapper[4851]: I1001 13:16:05.589207 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c4f994dc9-lkgs9"] Oct 01 13:16:06 crc kubenswrapper[4851]: I1001 13:16:06.260648 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5b04518a-1699-4ef1-8b54-57c7343e081c","Type":"ContainerStarted","Data":"ff5712d1ba804acc6477dc364f6405daa960c777fc28183618ebc0428e2851c5"} Oct 01 13:16:06 crc kubenswrapper[4851]: I1001 13:16:06.261011 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b04518a-1699-4ef1-8b54-57c7343e081c","Type":"ContainerStarted","Data":"4b23480f46b90202686557ef12a4fb580f95ad895e98b05fc0048c21124e67fd"} Oct 01 13:16:06 crc kubenswrapper[4851]: I1001 13:16:06.340365 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6db851a0-f990-44cf-8766-ef6891037e3a" path="/var/lib/kubelet/pods/6db851a0-f990-44cf-8766-ef6891037e3a/volumes" Oct 01 13:16:08 crc kubenswrapper[4851]: I1001 13:16:08.303896 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b04518a-1699-4ef1-8b54-57c7343e081c","Type":"ContainerStarted","Data":"6b08f645e87135f278ffd1557b39b421455a1aeff6c970312c41fac3725505ae"} Oct 01 13:16:08 crc kubenswrapper[4851]: I1001 13:16:08.304640 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 13:16:08 crc kubenswrapper[4851]: I1001 13:16:08.307729 4851 generic.go:334] "Generic (PLEG): container finished" podID="4e7c9248-7155-4f8b-a070-3374a93bc56b" containerID="48567905938e95dcebd6cf2bd21f5a663d779580c6cf0f0e7c1b928816e4a449" exitCode=0 Oct 01 13:16:08 crc kubenswrapper[4851]: I1001 13:16:08.307789 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-krnh6" event={"ID":"4e7c9248-7155-4f8b-a070-3374a93bc56b","Type":"ContainerDied","Data":"48567905938e95dcebd6cf2bd21f5a663d779580c6cf0f0e7c1b928816e4a449"} Oct 01 13:16:08 crc kubenswrapper[4851]: I1001 13:16:08.347797 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.29196352 podStartE2EDuration="5.347777383s" podCreationTimestamp="2025-10-01 13:16:03 +0000 UTC" firstStartedPulling="2025-10-01 13:16:04.399360618 +0000 UTC m=+1372.744478104" lastFinishedPulling="2025-10-01 13:16:07.455174481 +0000 UTC m=+1375.800291967" observedRunningTime="2025-10-01 13:16:08.341604425 +0000 UTC m=+1376.686721911" watchObservedRunningTime="2025-10-01 13:16:08.347777383 +0000 UTC m=+1376.692894869" Oct 01 13:16:09 crc kubenswrapper[4851]: I1001 13:16:09.866675 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-krnh6" Oct 01 13:16:09 crc kubenswrapper[4851]: I1001 13:16:09.958840 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e7c9248-7155-4f8b-a070-3374a93bc56b-config-data\") pod \"4e7c9248-7155-4f8b-a070-3374a93bc56b\" (UID: \"4e7c9248-7155-4f8b-a070-3374a93bc56b\") " Oct 01 13:16:09 crc kubenswrapper[4851]: I1001 13:16:09.959343 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg96l\" (UniqueName: \"kubernetes.io/projected/4e7c9248-7155-4f8b-a070-3374a93bc56b-kube-api-access-gg96l\") pod \"4e7c9248-7155-4f8b-a070-3374a93bc56b\" (UID: \"4e7c9248-7155-4f8b-a070-3374a93bc56b\") " Oct 01 13:16:09 crc kubenswrapper[4851]: I1001 13:16:09.959449 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e7c9248-7155-4f8b-a070-3374a93bc56b-scripts\") pod \"4e7c9248-7155-4f8b-a070-3374a93bc56b\" (UID: \"4e7c9248-7155-4f8b-a070-3374a93bc56b\") " Oct 01 13:16:09 crc kubenswrapper[4851]: I1001 13:16:09.959530 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e7c9248-7155-4f8b-a070-3374a93bc56b-combined-ca-bundle\") pod \"4e7c9248-7155-4f8b-a070-3374a93bc56b\" (UID: \"4e7c9248-7155-4f8b-a070-3374a93bc56b\") " Oct 01 13:16:09 crc kubenswrapper[4851]: I1001 13:16:09.967111 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e7c9248-7155-4f8b-a070-3374a93bc56b-kube-api-access-gg96l" (OuterVolumeSpecName: "kube-api-access-gg96l") pod "4e7c9248-7155-4f8b-a070-3374a93bc56b" (UID: "4e7c9248-7155-4f8b-a070-3374a93bc56b"). InnerVolumeSpecName "kube-api-access-gg96l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:16:09 crc kubenswrapper[4851]: I1001 13:16:09.967722 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e7c9248-7155-4f8b-a070-3374a93bc56b-scripts" (OuterVolumeSpecName: "scripts") pod "4e7c9248-7155-4f8b-a070-3374a93bc56b" (UID: "4e7c9248-7155-4f8b-a070-3374a93bc56b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:10 crc kubenswrapper[4851]: I1001 13:16:10.008259 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e7c9248-7155-4f8b-a070-3374a93bc56b-config-data" (OuterVolumeSpecName: "config-data") pod "4e7c9248-7155-4f8b-a070-3374a93bc56b" (UID: "4e7c9248-7155-4f8b-a070-3374a93bc56b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:10 crc kubenswrapper[4851]: I1001 13:16:10.017922 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e7c9248-7155-4f8b-a070-3374a93bc56b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e7c9248-7155-4f8b-a070-3374a93bc56b" (UID: "4e7c9248-7155-4f8b-a070-3374a93bc56b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:10 crc kubenswrapper[4851]: I1001 13:16:10.062703 4851 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e7c9248-7155-4f8b-a070-3374a93bc56b-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:10 crc kubenswrapper[4851]: I1001 13:16:10.062747 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e7c9248-7155-4f8b-a070-3374a93bc56b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:10 crc kubenswrapper[4851]: I1001 13:16:10.062766 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e7c9248-7155-4f8b-a070-3374a93bc56b-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:10 crc kubenswrapper[4851]: I1001 13:16:10.062779 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg96l\" (UniqueName: \"kubernetes.io/projected/4e7c9248-7155-4f8b-a070-3374a93bc56b-kube-api-access-gg96l\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:10 crc kubenswrapper[4851]: I1001 13:16:10.177677 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6c4f994dc9-lkgs9" podUID="6db851a0-f990-44cf-8766-ef6891037e3a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.207:5353: i/o timeout" Oct 01 13:16:10 crc kubenswrapper[4851]: I1001 13:16:10.332828 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-krnh6" Oct 01 13:16:10 crc kubenswrapper[4851]: I1001 13:16:10.348662 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-krnh6" event={"ID":"4e7c9248-7155-4f8b-a070-3374a93bc56b","Type":"ContainerDied","Data":"fc9c6a25576324d2d8a756746a9b9a1ae9de48adf5677ec238f15ecedafb0aad"} Oct 01 13:16:10 crc kubenswrapper[4851]: I1001 13:16:10.348721 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc9c6a25576324d2d8a756746a9b9a1ae9de48adf5677ec238f15ecedafb0aad" Oct 01 13:16:10 crc kubenswrapper[4851]: I1001 13:16:10.518919 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 13:16:10 crc kubenswrapper[4851]: I1001 13:16:10.518968 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 13:16:10 crc kubenswrapper[4851]: I1001 13:16:10.562746 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:16:10 crc kubenswrapper[4851]: I1001 13:16:10.574791 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:16:10 crc kubenswrapper[4851]: I1001 13:16:10.575046 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="58cd22d1-6ef7-4719-bf22-8658c400786c" containerName="nova-scheduler-scheduler" containerID="cri-o://7c598c2dc9bd731a9c1659e63f57a189fd4d020c746a568812112b0d92d7a536" gracePeriod=30 Oct 01 13:16:10 crc kubenswrapper[4851]: I1001 13:16:10.610844 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 13:16:10 crc kubenswrapper[4851]: I1001 13:16:10.611106 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="68393f76-bfd1-40fc-b0e7-01bd567a657e" containerName="nova-metadata-log" 
containerID="cri-o://363495ed94168b195c1ebffa2e9494d2399e56656a01d28b9b93f3e13406b579" gracePeriod=30 Oct 01 13:16:10 crc kubenswrapper[4851]: I1001 13:16:10.611262 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="68393f76-bfd1-40fc-b0e7-01bd567a657e" containerName="nova-metadata-metadata" containerID="cri-o://e2980cda0bf29083305c6ab5e378b29e91db010a3c7bc1aba36d3d0f3d34cbd9" gracePeriod=30 Oct 01 13:16:11 crc kubenswrapper[4851]: I1001 13:16:11.346005 4851 generic.go:334] "Generic (PLEG): container finished" podID="68393f76-bfd1-40fc-b0e7-01bd567a657e" containerID="363495ed94168b195c1ebffa2e9494d2399e56656a01d28b9b93f3e13406b579" exitCode=143 Oct 01 13:16:11 crc kubenswrapper[4851]: I1001 13:16:11.346729 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d967224f-1bb3-4493-b3da-75fecbe641f8" containerName="nova-api-log" containerID="cri-o://e8f76d934c91c7e493ae3b5570bfc68d3ad4fe285237022e2db46b03651e181f" gracePeriod=30 Oct 01 13:16:11 crc kubenswrapper[4851]: I1001 13:16:11.346828 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"68393f76-bfd1-40fc-b0e7-01bd567a657e","Type":"ContainerDied","Data":"363495ed94168b195c1ebffa2e9494d2399e56656a01d28b9b93f3e13406b579"} Oct 01 13:16:11 crc kubenswrapper[4851]: I1001 13:16:11.347323 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d967224f-1bb3-4493-b3da-75fecbe641f8" containerName="nova-api-api" containerID="cri-o://c5f3dfc42f20b40da1e581e4382d8106b1cdc183d51ce1411afbb0278d1ee60d" gracePeriod=30 Oct 01 13:16:11 crc kubenswrapper[4851]: I1001 13:16:11.352902 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d967224f-1bb3-4493-b3da-75fecbe641f8" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.219:8774/\": EOF" Oct 01 13:16:11 crc kubenswrapper[4851]: I1001 13:16:11.352946 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d967224f-1bb3-4493-b3da-75fecbe641f8" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.219:8774/\": EOF" Oct 01 13:16:11 crc kubenswrapper[4851]: I1001 13:16:11.935114 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.031244 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86l8l\" (UniqueName: \"kubernetes.io/projected/68393f76-bfd1-40fc-b0e7-01bd567a657e-kube-api-access-86l8l\") pod \"68393f76-bfd1-40fc-b0e7-01bd567a657e\" (UID: \"68393f76-bfd1-40fc-b0e7-01bd567a657e\") " Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.031607 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/68393f76-bfd1-40fc-b0e7-01bd567a657e-nova-metadata-tls-certs\") pod \"68393f76-bfd1-40fc-b0e7-01bd567a657e\" (UID: \"68393f76-bfd1-40fc-b0e7-01bd567a657e\") " Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.031835 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68393f76-bfd1-40fc-b0e7-01bd567a657e-logs\") pod \"68393f76-bfd1-40fc-b0e7-01bd567a657e\" (UID: \"68393f76-bfd1-40fc-b0e7-01bd567a657e\") " Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.031890 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68393f76-bfd1-40fc-b0e7-01bd567a657e-config-data\") pod \"68393f76-bfd1-40fc-b0e7-01bd567a657e\" (UID: \"68393f76-bfd1-40fc-b0e7-01bd567a657e\") " Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.031914 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68393f76-bfd1-40fc-b0e7-01bd567a657e-combined-ca-bundle\") pod \"68393f76-bfd1-40fc-b0e7-01bd567a657e\" (UID: \"68393f76-bfd1-40fc-b0e7-01bd567a657e\") " Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.033422 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68393f76-bfd1-40fc-b0e7-01bd567a657e-logs" (OuterVolumeSpecName: "logs") pod "68393f76-bfd1-40fc-b0e7-01bd567a657e" (UID: "68393f76-bfd1-40fc-b0e7-01bd567a657e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.055099 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68393f76-bfd1-40fc-b0e7-01bd567a657e-kube-api-access-86l8l" (OuterVolumeSpecName: "kube-api-access-86l8l") pod "68393f76-bfd1-40fc-b0e7-01bd567a657e" (UID: "68393f76-bfd1-40fc-b0e7-01bd567a657e"). InnerVolumeSpecName "kube-api-access-86l8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.089599 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68393f76-bfd1-40fc-b0e7-01bd567a657e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68393f76-bfd1-40fc-b0e7-01bd567a657e" (UID: "68393f76-bfd1-40fc-b0e7-01bd567a657e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.090150 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68393f76-bfd1-40fc-b0e7-01bd567a657e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "68393f76-bfd1-40fc-b0e7-01bd567a657e" (UID: "68393f76-bfd1-40fc-b0e7-01bd567a657e"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.093115 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68393f76-bfd1-40fc-b0e7-01bd567a657e-config-data" (OuterVolumeSpecName: "config-data") pod "68393f76-bfd1-40fc-b0e7-01bd567a657e" (UID: "68393f76-bfd1-40fc-b0e7-01bd567a657e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.136868 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86l8l\" (UniqueName: \"kubernetes.io/projected/68393f76-bfd1-40fc-b0e7-01bd567a657e-kube-api-access-86l8l\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.136922 4851 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/68393f76-bfd1-40fc-b0e7-01bd567a657e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.136935 4851 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68393f76-bfd1-40fc-b0e7-01bd567a657e-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.136950 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68393f76-bfd1-40fc-b0e7-01bd567a657e-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.136964 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68393f76-bfd1-40fc-b0e7-01bd567a657e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.359860 4851 generic.go:334] "Generic (PLEG): container finished" podID="d967224f-1bb3-4493-b3da-75fecbe641f8" containerID="e8f76d934c91c7e493ae3b5570bfc68d3ad4fe285237022e2db46b03651e181f" exitCode=143 Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.359935 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d967224f-1bb3-4493-b3da-75fecbe641f8","Type":"ContainerDied","Data":"e8f76d934c91c7e493ae3b5570bfc68d3ad4fe285237022e2db46b03651e181f"} Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.364094 4851 generic.go:334] "Generic (PLEG): container finished" podID="68393f76-bfd1-40fc-b0e7-01bd567a657e" containerID="e2980cda0bf29083305c6ab5e378b29e91db010a3c7bc1aba36d3d0f3d34cbd9" exitCode=0 Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.364128 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"68393f76-bfd1-40fc-b0e7-01bd567a657e","Type":"ContainerDied","Data":"e2980cda0bf29083305c6ab5e378b29e91db010a3c7bc1aba36d3d0f3d34cbd9"} Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.364150 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"68393f76-bfd1-40fc-b0e7-01bd567a657e","Type":"ContainerDied","Data":"d3c40141346fba4e22c0b27631d452fd2b63df625157e4c71b55ffe79f5a2146"} Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.364168 4851 scope.go:117] "RemoveContainer" containerID="e2980cda0bf29083305c6ab5e378b29e91db010a3c7bc1aba36d3d0f3d34cbd9" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.364169 4851 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.432252 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.432587 4851 scope.go:117] "RemoveContainer" containerID="363495ed94168b195c1ebffa2e9494d2399e56656a01d28b9b93f3e13406b579" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.441843 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.460154 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 13:16:12 crc kubenswrapper[4851]: E1001 13:16:12.460588 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68393f76-bfd1-40fc-b0e7-01bd567a657e" containerName="nova-metadata-log" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.460608 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="68393f76-bfd1-40fc-b0e7-01bd567a657e" containerName="nova-metadata-log" Oct 01 13:16:12 crc kubenswrapper[4851]: E1001 13:16:12.460633 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e7c9248-7155-4f8b-a070-3374a93bc56b" containerName="nova-manage" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.460639 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e7c9248-7155-4f8b-a070-3374a93bc56b" containerName="nova-manage" Oct 01 13:16:12 crc kubenswrapper[4851]: E1001 13:16:12.460656 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db851a0-f990-44cf-8766-ef6891037e3a" containerName="dnsmasq-dns" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.460662 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db851a0-f990-44cf-8766-ef6891037e3a" containerName="dnsmasq-dns" Oct 01 13:16:12 crc kubenswrapper[4851]: E1001 13:16:12.460675 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68393f76-bfd1-40fc-b0e7-01bd567a657e" containerName="nova-metadata-metadata" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.460681 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="68393f76-bfd1-40fc-b0e7-01bd567a657e" containerName="nova-metadata-metadata" Oct 01 13:16:12 crc kubenswrapper[4851]: E1001 13:16:12.460696 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db851a0-f990-44cf-8766-ef6891037e3a" containerName="init" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.460701 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db851a0-f990-44cf-8766-ef6891037e3a" containerName="init" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.460895 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e7c9248-7155-4f8b-a070-3374a93bc56b" containerName="nova-manage" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.460911 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="68393f76-bfd1-40fc-b0e7-01bd567a657e" containerName="nova-metadata-metadata" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.460926 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="6db851a0-f990-44cf-8766-ef6891037e3a" containerName="dnsmasq-dns" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.460944 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="68393f76-bfd1-40fc-b0e7-01bd567a657e" containerName="nova-metadata-log" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.461899 4851 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.464953 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.466601 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.502101 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.548824 4851 scope.go:117] "RemoveContainer" containerID="e2980cda0bf29083305c6ab5e378b29e91db010a3c7bc1aba36d3d0f3d34cbd9" Oct 01 13:16:12 crc kubenswrapper[4851]: E1001 13:16:12.550756 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2980cda0bf29083305c6ab5e378b29e91db010a3c7bc1aba36d3d0f3d34cbd9\": container with ID starting with e2980cda0bf29083305c6ab5e378b29e91db010a3c7bc1aba36d3d0f3d34cbd9 not found: ID does not exist" containerID="e2980cda0bf29083305c6ab5e378b29e91db010a3c7bc1aba36d3d0f3d34cbd9" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.550806 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2980cda0bf29083305c6ab5e378b29e91db010a3c7bc1aba36d3d0f3d34cbd9"} err="failed to get container status \"e2980cda0bf29083305c6ab5e378b29e91db010a3c7bc1aba36d3d0f3d34cbd9\": rpc error: code = NotFound desc = could not find container \"e2980cda0bf29083305c6ab5e378b29e91db010a3c7bc1aba36d3d0f3d34cbd9\": container with ID starting with e2980cda0bf29083305c6ab5e378b29e91db010a3c7bc1aba36d3d0f3d34cbd9 not found: ID does not exist" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.550834 4851 scope.go:117] "RemoveContainer" containerID="363495ed94168b195c1ebffa2e9494d2399e56656a01d28b9b93f3e13406b579" Oct 01 13:16:12 crc kubenswrapper[4851]: E1001 13:16:12.551309 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"363495ed94168b195c1ebffa2e9494d2399e56656a01d28b9b93f3e13406b579\": container with ID starting with 363495ed94168b195c1ebffa2e9494d2399e56656a01d28b9b93f3e13406b579 not found: ID does not exist" containerID="363495ed94168b195c1ebffa2e9494d2399e56656a01d28b9b93f3e13406b579" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.551350 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"363495ed94168b195c1ebffa2e9494d2399e56656a01d28b9b93f3e13406b579"} err="failed to get container status \"363495ed94168b195c1ebffa2e9494d2399e56656a01d28b9b93f3e13406b579\": rpc error: code = NotFound desc = could not find container \"363495ed94168b195c1ebffa2e9494d2399e56656a01d28b9b93f3e13406b579\": container with ID starting with 363495ed94168b195c1ebffa2e9494d2399e56656a01d28b9b93f3e13406b579 not found: ID does not exist" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.645556 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522\") " pod="openstack/nova-metadata-0" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.645940 4851 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6b4l\" (UniqueName: \"kubernetes.io/projected/f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522-kube-api-access-q6b4l\") pod \"nova-metadata-0\" (UID: \"f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522\") " pod="openstack/nova-metadata-0" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.645998 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522-config-data\") pod \"nova-metadata-0\" (UID: \"f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522\") " pod="openstack/nova-metadata-0" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.646044 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522-logs\") pod \"nova-metadata-0\" (UID: \"f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522\") " pod="openstack/nova-metadata-0" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.646177 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522\") " pod="openstack/nova-metadata-0" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.747970 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522-logs\") pod \"nova-metadata-0\" (UID: \"f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522\") " pod="openstack/nova-metadata-0" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.748066 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522\") " pod="openstack/nova-metadata-0" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.748177 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522\") " pod="openstack/nova-metadata-0" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.748220 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6b4l\" (UniqueName: \"kubernetes.io/projected/f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522-kube-api-access-q6b4l\") pod \"nova-metadata-0\" (UID: \"f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522\") " pod="openstack/nova-metadata-0" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.748293 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522-config-data\") pod \"nova-metadata-0\" (UID: \"f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522\") " pod="openstack/nova-metadata-0" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.748597 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522-logs\") pod \"nova-metadata-0\" (UID: 
\"f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522\") " pod="openstack/nova-metadata-0" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.752297 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522\") " pod="openstack/nova-metadata-0" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.752926 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522-config-data\") pod \"nova-metadata-0\" (UID: \"f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522\") " pod="openstack/nova-metadata-0" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.761926 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522\") " pod="openstack/nova-metadata-0" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.767565 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6b4l\" (UniqueName: \"kubernetes.io/projected/f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522-kube-api-access-q6b4l\") pod \"nova-metadata-0\" (UID: \"f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522\") " pod="openstack/nova-metadata-0" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.840104 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 13:16:12 crc kubenswrapper[4851]: I1001 13:16:12.965472 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.054432 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58cd22d1-6ef7-4719-bf22-8658c400786c-config-data\") pod \"58cd22d1-6ef7-4719-bf22-8658c400786c\" (UID: \"58cd22d1-6ef7-4719-bf22-8658c400786c\") " Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.054659 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58cd22d1-6ef7-4719-bf22-8658c400786c-combined-ca-bundle\") pod \"58cd22d1-6ef7-4719-bf22-8658c400786c\" (UID: \"58cd22d1-6ef7-4719-bf22-8658c400786c\") " Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.054707 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg2pw\" (UniqueName: \"kubernetes.io/projected/58cd22d1-6ef7-4719-bf22-8658c400786c-kube-api-access-cg2pw\") pod \"58cd22d1-6ef7-4719-bf22-8658c400786c\" (UID: \"58cd22d1-6ef7-4719-bf22-8658c400786c\") " Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.060192 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58cd22d1-6ef7-4719-bf22-8658c400786c-kube-api-access-cg2pw" (OuterVolumeSpecName: "kube-api-access-cg2pw") pod "58cd22d1-6ef7-4719-bf22-8658c400786c" (UID: "58cd22d1-6ef7-4719-bf22-8658c400786c"). InnerVolumeSpecName "kube-api-access-cg2pw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.090842 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58cd22d1-6ef7-4719-bf22-8658c400786c-config-data" (OuterVolumeSpecName: "config-data") pod "58cd22d1-6ef7-4719-bf22-8658c400786c" (UID: "58cd22d1-6ef7-4719-bf22-8658c400786c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.093717 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58cd22d1-6ef7-4719-bf22-8658c400786c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58cd22d1-6ef7-4719-bf22-8658c400786c" (UID: "58cd22d1-6ef7-4719-bf22-8658c400786c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.156462 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58cd22d1-6ef7-4719-bf22-8658c400786c-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.156494 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58cd22d1-6ef7-4719-bf22-8658c400786c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.156521 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg2pw\" (UniqueName: \"kubernetes.io/projected/58cd22d1-6ef7-4719-bf22-8658c400786c-kube-api-access-cg2pw\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.290989 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.376615 4851 generic.go:334] "Generic (PLEG): container finished" podID="58cd22d1-6ef7-4719-bf22-8658c400786c" containerID="7c598c2dc9bd731a9c1659e63f57a189fd4d020c746a568812112b0d92d7a536" exitCode=0 Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.376680 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"58cd22d1-6ef7-4719-bf22-8658c400786c","Type":"ContainerDied","Data":"7c598c2dc9bd731a9c1659e63f57a189fd4d020c746a568812112b0d92d7a536"} Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.376681 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.376706 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"58cd22d1-6ef7-4719-bf22-8658c400786c","Type":"ContainerDied","Data":"5544a49d707273c984c567effd50e7610de68866cd1ebb5f729b9f0a7723d49d"} Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.376723 4851 scope.go:117] "RemoveContainer" containerID="7c598c2dc9bd731a9c1659e63f57a189fd4d020c746a568812112b0d92d7a536" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.392143 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522","Type":"ContainerStarted","Data":"5c817019adf39b1ed92eed5a2a029048dbc35b8f232f5de414bec3b9cb31c4a0"} Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.425601 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.439080 4851 scope.go:117] "RemoveContainer" containerID="7c598c2dc9bd731a9c1659e63f57a189fd4d020c746a568812112b0d92d7a536" Oct 01 13:16:13 crc kubenswrapper[4851]: E1001 13:16:13.439457 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c598c2dc9bd731a9c1659e63f57a189fd4d020c746a568812112b0d92d7a536\": container with ID starting with 7c598c2dc9bd731a9c1659e63f57a189fd4d020c746a568812112b0d92d7a536 not found: ID does not exist" containerID="7c598c2dc9bd731a9c1659e63f57a189fd4d020c746a568812112b0d92d7a536" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.439492 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c598c2dc9bd731a9c1659e63f57a189fd4d020c746a568812112b0d92d7a536"} err="failed to get container status \"7c598c2dc9bd731a9c1659e63f57a189fd4d020c746a568812112b0d92d7a536\": rpc error: code = NotFound desc = could not find container \"7c598c2dc9bd731a9c1659e63f57a189fd4d020c746a568812112b0d92d7a536\": container with ID starting with 7c598c2dc9bd731a9c1659e63f57a189fd4d020c746a568812112b0d92d7a536 not found: ID does not exist" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.442924 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.468751 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:16:13 crc kubenswrapper[4851]: E1001 13:16:13.469144 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58cd22d1-6ef7-4719-bf22-8658c400786c" containerName="nova-scheduler-scheduler" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.469159 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="58cd22d1-6ef7-4719-bf22-8658c400786c" containerName="nova-scheduler-scheduler" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.469368 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="58cd22d1-6ef7-4719-bf22-8658c400786c" containerName="nova-scheduler-scheduler" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.471572 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.482109 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.508712 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.563572 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d829ebf1-e5aa-4c23-9a7e-db128f394557-config-data\") pod \"nova-scheduler-0\" (UID: \"d829ebf1-e5aa-4c23-9a7e-db128f394557\") " pod="openstack/nova-scheduler-0" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.563844 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d829ebf1-e5aa-4c23-9a7e-db128f394557-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d829ebf1-e5aa-4c23-9a7e-db128f394557\") " pod="openstack/nova-scheduler-0" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.564044 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9txcl\" (UniqueName: \"kubernetes.io/projected/d829ebf1-e5aa-4c23-9a7e-db128f394557-kube-api-access-9txcl\") pod \"nova-scheduler-0\" (UID: \"d829ebf1-e5aa-4c23-9a7e-db128f394557\") " pod="openstack/nova-scheduler-0" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.665399 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9txcl\" (UniqueName: \"kubernetes.io/projected/d829ebf1-e5aa-4c23-9a7e-db128f394557-kube-api-access-9txcl\") pod \"nova-scheduler-0\" (UID: \"d829ebf1-e5aa-4c23-9a7e-db128f394557\") " pod="openstack/nova-scheduler-0" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.665516 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d829ebf1-e5aa-4c23-9a7e-db128f394557-config-data\") pod \"nova-scheduler-0\" (UID: \"d829ebf1-e5aa-4c23-9a7e-db128f394557\") " pod="openstack/nova-scheduler-0" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.665585 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d829ebf1-e5aa-4c23-9a7e-db128f394557-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d829ebf1-e5aa-4c23-9a7e-db128f394557\") " pod="openstack/nova-scheduler-0" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.669001 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d829ebf1-e5aa-4c23-9a7e-db128f394557-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d829ebf1-e5aa-4c23-9a7e-db128f394557\") " pod="openstack/nova-scheduler-0" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.670239 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d829ebf1-e5aa-4c23-9a7e-db128f394557-config-data\") pod \"nova-scheduler-0\" (UID: \"d829ebf1-e5aa-4c23-9a7e-db128f394557\") " pod="openstack/nova-scheduler-0" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.682078 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9txcl\" (UniqueName: 
\"kubernetes.io/projected/d829ebf1-e5aa-4c23-9a7e-db128f394557-kube-api-access-9txcl\") pod \"nova-scheduler-0\" (UID: \"d829ebf1-e5aa-4c23-9a7e-db128f394557\") " pod="openstack/nova-scheduler-0" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.756801 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.975366 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lnxhp"] Oct 01 13:16:13 crc kubenswrapper[4851]: I1001 13:16:13.988200 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lnxhp" Oct 01 13:16:14 crc kubenswrapper[4851]: I1001 13:16:14.002200 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnxhp"] Oct 01 13:16:14 crc kubenswrapper[4851]: I1001 13:16:14.077596 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fmr8\" (UniqueName: \"kubernetes.io/projected/4853c804-3779-47f0-a588-e11f25ec2857-kube-api-access-7fmr8\") pod \"redhat-marketplace-lnxhp\" (UID: \"4853c804-3779-47f0-a588-e11f25ec2857\") " pod="openshift-marketplace/redhat-marketplace-lnxhp" Oct 01 13:16:14 crc kubenswrapper[4851]: I1001 13:16:14.077661 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4853c804-3779-47f0-a588-e11f25ec2857-catalog-content\") pod \"redhat-marketplace-lnxhp\" (UID: \"4853c804-3779-47f0-a588-e11f25ec2857\") " pod="openshift-marketplace/redhat-marketplace-lnxhp" Oct 01 13:16:14 crc kubenswrapper[4851]: I1001 13:16:14.077748 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4853c804-3779-47f0-a588-e11f25ec2857-utilities\") pod \"redhat-marketplace-lnxhp\" (UID: \"4853c804-3779-47f0-a588-e11f25ec2857\") " pod="openshift-marketplace/redhat-marketplace-lnxhp" Oct 01 13:16:14 crc kubenswrapper[4851]: I1001 13:16:14.180080 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4853c804-3779-47f0-a588-e11f25ec2857-utilities\") pod \"redhat-marketplace-lnxhp\" (UID: \"4853c804-3779-47f0-a588-e11f25ec2857\") " pod="openshift-marketplace/redhat-marketplace-lnxhp" Oct 01 13:16:14 crc kubenswrapper[4851]: I1001 13:16:14.180249 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fmr8\" (UniqueName: \"kubernetes.io/projected/4853c804-3779-47f0-a588-e11f25ec2857-kube-api-access-7fmr8\") pod \"redhat-marketplace-lnxhp\" (UID: \"4853c804-3779-47f0-a588-e11f25ec2857\") " pod="openshift-marketplace/redhat-marketplace-lnxhp" Oct 01 13:16:14 crc kubenswrapper[4851]: I1001 13:16:14.180724 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4853c804-3779-47f0-a588-e11f25ec2857-catalog-content\") pod \"redhat-marketplace-lnxhp\" (UID: \"4853c804-3779-47f0-a588-e11f25ec2857\") " pod="openshift-marketplace/redhat-marketplace-lnxhp" Oct 01 13:16:14 crc kubenswrapper[4851]: I1001 13:16:14.180751 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4853c804-3779-47f0-a588-e11f25ec2857-utilities\") pod \"redhat-marketplace-lnxhp\" (UID: \"4853c804-3779-47f0-a588-e11f25ec2857\") " pod="openshift-marketplace/redhat-marketplace-lnxhp" Oct 01 13:16:14 crc kubenswrapper[4851]: I1001 13:16:14.181135 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4853c804-3779-47f0-a588-e11f25ec2857-catalog-content\") pod \"redhat-marketplace-lnxhp\" (UID: \"4853c804-3779-47f0-a588-e11f25ec2857\") " pod="openshift-marketplace/redhat-marketplace-lnxhp" Oct 01 13:16:14 crc kubenswrapper[4851]: I1001 13:16:14.203142 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fmr8\" (UniqueName: \"kubernetes.io/projected/4853c804-3779-47f0-a588-e11f25ec2857-kube-api-access-7fmr8\") pod \"redhat-marketplace-lnxhp\" (UID: \"4853c804-3779-47f0-a588-e11f25ec2857\") " pod="openshift-marketplace/redhat-marketplace-lnxhp" Oct 01 13:16:14 crc kubenswrapper[4851]: I1001 13:16:14.281828 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:16:14 crc kubenswrapper[4851]: I1001 13:16:14.315147 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lnxhp" Oct 01 13:16:14 crc kubenswrapper[4851]: I1001 13:16:14.341570 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58cd22d1-6ef7-4719-bf22-8658c400786c" path="/var/lib/kubelet/pods/58cd22d1-6ef7-4719-bf22-8658c400786c/volumes" Oct 01 13:16:14 crc kubenswrapper[4851]: I1001 13:16:14.342228 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68393f76-bfd1-40fc-b0e7-01bd567a657e" path="/var/lib/kubelet/pods/68393f76-bfd1-40fc-b0e7-01bd567a657e/volumes" Oct 01 13:16:14 crc kubenswrapper[4851]: I1001 13:16:14.422275 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522","Type":"ContainerStarted","Data":"1f7f2d6941ad62edbf4059a1ce813e8b29618e6dbcbd994e5da9489acd3f2e6e"} Oct 01 13:16:14 crc kubenswrapper[4851]: I1001 13:16:14.423036 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522","Type":"ContainerStarted","Data":"06e4e60b02747cd35bbc2eaaefa742d8d4cbf28230de243d1100137e08986f51"} Oct 01 13:16:14 crc kubenswrapper[4851]: I1001 13:16:14.429737 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d829ebf1-e5aa-4c23-9a7e-db128f394557","Type":"ContainerStarted","Data":"70bef57b8455e6d5467c6cb1a2cf277e9129874197a2bee93bd9924042cd45ec"} Oct 01 13:16:14 crc kubenswrapper[4851]: I1001 13:16:14.821188 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.821165615 podStartE2EDuration="2.821165615s" podCreationTimestamp="2025-10-01 13:16:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:16:14.452631706 +0000 UTC m=+1382.797749192" watchObservedRunningTime="2025-10-01 13:16:14.821165615 +0000 UTC m=+1383.166283111" Oct 01 13:16:14 crc kubenswrapper[4851]: I1001 13:16:14.829613 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnxhp"] Oct 01 13:16:14 crc kubenswrapper[4851]: I1001 13:16:14.984468 4851 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.104125 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-internal-tls-certs\") pod \"d967224f-1bb3-4493-b3da-75fecbe641f8\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.104735 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-config-data\") pod \"d967224f-1bb3-4493-b3da-75fecbe641f8\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.105386 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgvpt\" (UniqueName: \"kubernetes.io/projected/d967224f-1bb3-4493-b3da-75fecbe641f8-kube-api-access-cgvpt\") pod \"d967224f-1bb3-4493-b3da-75fecbe641f8\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.105563 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-public-tls-certs\") pod \"d967224f-1bb3-4493-b3da-75fecbe641f8\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.105620 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-combined-ca-bundle\") pod \"d967224f-1bb3-4493-b3da-75fecbe641f8\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.106179 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d967224f-1bb3-4493-b3da-75fecbe641f8-logs\") pod \"d967224f-1bb3-4493-b3da-75fecbe641f8\" (UID: \"d967224f-1bb3-4493-b3da-75fecbe641f8\") " Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.106721 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d967224f-1bb3-4493-b3da-75fecbe641f8-logs" (OuterVolumeSpecName: "logs") pod "d967224f-1bb3-4493-b3da-75fecbe641f8" (UID: "d967224f-1bb3-4493-b3da-75fecbe641f8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.107109 4851 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d967224f-1bb3-4493-b3da-75fecbe641f8-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.128767 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d967224f-1bb3-4493-b3da-75fecbe641f8-kube-api-access-cgvpt" (OuterVolumeSpecName: "kube-api-access-cgvpt") pod "d967224f-1bb3-4493-b3da-75fecbe641f8" (UID: "d967224f-1bb3-4493-b3da-75fecbe641f8"). InnerVolumeSpecName "kube-api-access-cgvpt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.148713 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-config-data" (OuterVolumeSpecName: "config-data") pod "d967224f-1bb3-4493-b3da-75fecbe641f8" (UID: "d967224f-1bb3-4493-b3da-75fecbe641f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.158262 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d967224f-1bb3-4493-b3da-75fecbe641f8" (UID: "d967224f-1bb3-4493-b3da-75fecbe641f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.171586 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d967224f-1bb3-4493-b3da-75fecbe641f8" (UID: "d967224f-1bb3-4493-b3da-75fecbe641f8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.172066 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d967224f-1bb3-4493-b3da-75fecbe641f8" (UID: "d967224f-1bb3-4493-b3da-75fecbe641f8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.209275 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.209313 4851 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.209326 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.209337 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgvpt\" (UniqueName: \"kubernetes.io/projected/d967224f-1bb3-4493-b3da-75fecbe641f8-kube-api-access-cgvpt\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.209351 4851 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d967224f-1bb3-4493-b3da-75fecbe641f8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.441268 4851 generic.go:334] "Generic (PLEG): container finished" podID="4853c804-3779-47f0-a588-e11f25ec2857" containerID="43a7606c011d8cfe948dfe4cda2f868071960e75c2e5401c5425f5700c8b6910" exitCode=0 Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.441346 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-lnxhp" event={"ID":"4853c804-3779-47f0-a588-e11f25ec2857","Type":"ContainerDied","Data":"43a7606c011d8cfe948dfe4cda2f868071960e75c2e5401c5425f5700c8b6910"} Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.441372 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnxhp" event={"ID":"4853c804-3779-47f0-a588-e11f25ec2857","Type":"ContainerStarted","Data":"c68e3b3bd56e960bcecdad82e6bc911abbaefcf655e38714a1a328adf8a9ac9c"} Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.445390 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d829ebf1-e5aa-4c23-9a7e-db128f394557","Type":"ContainerStarted","Data":"0068fcc642576e02db3dd98d4fe043659830371066218d826f405f13086530ce"} Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.452748 4851 generic.go:334] "Generic (PLEG): container finished" podID="d967224f-1bb3-4493-b3da-75fecbe641f8" containerID="c5f3dfc42f20b40da1e581e4382d8106b1cdc183d51ce1411afbb0278d1ee60d" exitCode=0 Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.452803 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.452843 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d967224f-1bb3-4493-b3da-75fecbe641f8","Type":"ContainerDied","Data":"c5f3dfc42f20b40da1e581e4382d8106b1cdc183d51ce1411afbb0278d1ee60d"} Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.452875 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d967224f-1bb3-4493-b3da-75fecbe641f8","Type":"ContainerDied","Data":"eaed1e96ac968154cccfbd9eaefc17239b205532eedeff3687ee5177ae675aec"} Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.452916 4851 scope.go:117] "RemoveContainer" containerID="c5f3dfc42f20b40da1e581e4382d8106b1cdc183d51ce1411afbb0278d1ee60d" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.487817 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.487797828 podStartE2EDuration="2.487797828s" podCreationTimestamp="2025-10-01 13:16:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:16:15.480355195 +0000 UTC m=+1383.825472681" watchObservedRunningTime="2025-10-01 13:16:15.487797828 +0000 UTC m=+1383.832915334" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.566007 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.574911 4851 scope.go:117] "RemoveContainer" containerID="e8f76d934c91c7e493ae3b5570bfc68d3ad4fe285237022e2db46b03651e181f" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.580781 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.594295 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 13:16:15 crc kubenswrapper[4851]: E1001 13:16:15.594760 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d967224f-1bb3-4493-b3da-75fecbe641f8" containerName="nova-api-log" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.594773 4851 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d967224f-1bb3-4493-b3da-75fecbe641f8" containerName="nova-api-log" Oct 01 13:16:15 crc kubenswrapper[4851]: E1001 13:16:15.594795 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d967224f-1bb3-4493-b3da-75fecbe641f8" containerName="nova-api-api" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.594801 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="d967224f-1bb3-4493-b3da-75fecbe641f8" containerName="nova-api-api" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.595003 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="d967224f-1bb3-4493-b3da-75fecbe641f8" containerName="nova-api-api" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.595020 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="d967224f-1bb3-4493-b3da-75fecbe641f8" containerName="nova-api-log" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.596077 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.603807 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.609313 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.611385 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.611636 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.633792 4851 scope.go:117] "RemoveContainer" containerID="c5f3dfc42f20b40da1e581e4382d8106b1cdc183d51ce1411afbb0278d1ee60d" Oct 01 13:16:15 crc kubenswrapper[4851]: E1001 13:16:15.634615 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5f3dfc42f20b40da1e581e4382d8106b1cdc183d51ce1411afbb0278d1ee60d\": container with ID starting with c5f3dfc42f20b40da1e581e4382d8106b1cdc183d51ce1411afbb0278d1ee60d not found: ID does not exist" containerID="c5f3dfc42f20b40da1e581e4382d8106b1cdc183d51ce1411afbb0278d1ee60d" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.634649 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5f3dfc42f20b40da1e581e4382d8106b1cdc183d51ce1411afbb0278d1ee60d"} err="failed to get container status \"c5f3dfc42f20b40da1e581e4382d8106b1cdc183d51ce1411afbb0278d1ee60d\": rpc error: code = NotFound desc = could not find container \"c5f3dfc42f20b40da1e581e4382d8106b1cdc183d51ce1411afbb0278d1ee60d\": container with ID starting with c5f3dfc42f20b40da1e581e4382d8106b1cdc183d51ce1411afbb0278d1ee60d not found: ID does not exist" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.634669 4851 scope.go:117] "RemoveContainer" containerID="e8f76d934c91c7e493ae3b5570bfc68d3ad4fe285237022e2db46b03651e181f" Oct 01 13:16:15 crc kubenswrapper[4851]: E1001 13:16:15.636069 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8f76d934c91c7e493ae3b5570bfc68d3ad4fe285237022e2db46b03651e181f\": container with ID starting with e8f76d934c91c7e493ae3b5570bfc68d3ad4fe285237022e2db46b03651e181f not found: ID does not exist" 
containerID="e8f76d934c91c7e493ae3b5570bfc68d3ad4fe285237022e2db46b03651e181f" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.636114 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8f76d934c91c7e493ae3b5570bfc68d3ad4fe285237022e2db46b03651e181f"} err="failed to get container status \"e8f76d934c91c7e493ae3b5570bfc68d3ad4fe285237022e2db46b03651e181f\": rpc error: code = NotFound desc = could not find container \"e8f76d934c91c7e493ae3b5570bfc68d3ad4fe285237022e2db46b03651e181f\": container with ID starting with e8f76d934c91c7e493ae3b5570bfc68d3ad4fe285237022e2db46b03651e181f not found: ID does not exist" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.719977 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88f8fcd9-1105-4592-a420-98ea9033c3d9-config-data\") pod \"nova-api-0\" (UID: \"88f8fcd9-1105-4592-a420-98ea9033c3d9\") " pod="openstack/nova-api-0" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.720117 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88f8fcd9-1105-4592-a420-98ea9033c3d9-public-tls-certs\") pod \"nova-api-0\" (UID: \"88f8fcd9-1105-4592-a420-98ea9033c3d9\") " pod="openstack/nova-api-0" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.720174 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88f8fcd9-1105-4592-a420-98ea9033c3d9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"88f8fcd9-1105-4592-a420-98ea9033c3d9\") " pod="openstack/nova-api-0" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.720224 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88f8fcd9-1105-4592-a420-98ea9033c3d9-logs\") pod \"nova-api-0\" (UID: \"88f8fcd9-1105-4592-a420-98ea9033c3d9\") " pod="openstack/nova-api-0" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.720300 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88f8fcd9-1105-4592-a420-98ea9033c3d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"88f8fcd9-1105-4592-a420-98ea9033c3d9\") " pod="openstack/nova-api-0" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.720345 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gb4x\" (UniqueName: \"kubernetes.io/projected/88f8fcd9-1105-4592-a420-98ea9033c3d9-kube-api-access-4gb4x\") pod \"nova-api-0\" (UID: \"88f8fcd9-1105-4592-a420-98ea9033c3d9\") " pod="openstack/nova-api-0" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.822677 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88f8fcd9-1105-4592-a420-98ea9033c3d9-config-data\") pod \"nova-api-0\" (UID: \"88f8fcd9-1105-4592-a420-98ea9033c3d9\") " pod="openstack/nova-api-0" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.822887 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88f8fcd9-1105-4592-a420-98ea9033c3d9-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"88f8fcd9-1105-4592-a420-98ea9033c3d9\") " pod="openstack/nova-api-0" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.822950 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88f8fcd9-1105-4592-a420-98ea9033c3d9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"88f8fcd9-1105-4592-a420-98ea9033c3d9\") " pod="openstack/nova-api-0" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.823043 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88f8fcd9-1105-4592-a420-98ea9033c3d9-logs\") pod \"nova-api-0\" (UID: \"88f8fcd9-1105-4592-a420-98ea9033c3d9\") " pod="openstack/nova-api-0" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.823816 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88f8fcd9-1105-4592-a420-98ea9033c3d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"88f8fcd9-1105-4592-a420-98ea9033c3d9\") " pod="openstack/nova-api-0" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.823887 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gb4x\" (UniqueName: \"kubernetes.io/projected/88f8fcd9-1105-4592-a420-98ea9033c3d9-kube-api-access-4gb4x\") pod \"nova-api-0\" (UID: \"88f8fcd9-1105-4592-a420-98ea9033c3d9\") " pod="openstack/nova-api-0" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.824998 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88f8fcd9-1105-4592-a420-98ea9033c3d9-logs\") pod \"nova-api-0\" (UID: \"88f8fcd9-1105-4592-a420-98ea9033c3d9\") " pod="openstack/nova-api-0" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.827350 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88f8fcd9-1105-4592-a420-98ea9033c3d9-public-tls-certs\") pod \"nova-api-0\" (UID: \"88f8fcd9-1105-4592-a420-98ea9033c3d9\") " pod="openstack/nova-api-0" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.830992 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88f8fcd9-1105-4592-a420-98ea9033c3d9-config-data\") pod \"nova-api-0\" (UID: \"88f8fcd9-1105-4592-a420-98ea9033c3d9\") " pod="openstack/nova-api-0" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.831147 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88f8fcd9-1105-4592-a420-98ea9033c3d9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"88f8fcd9-1105-4592-a420-98ea9033c3d9\") " pod="openstack/nova-api-0" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.838611 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88f8fcd9-1105-4592-a420-98ea9033c3d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"88f8fcd9-1105-4592-a420-98ea9033c3d9\") " pod="openstack/nova-api-0" Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.846163 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gb4x\" (UniqueName: \"kubernetes.io/projected/88f8fcd9-1105-4592-a420-98ea9033c3d9-kube-api-access-4gb4x\") pod \"nova-api-0\" (UID: \"88f8fcd9-1105-4592-a420-98ea9033c3d9\") " pod="openstack/nova-api-0" 
Oct 01 13:16:15 crc kubenswrapper[4851]: I1001 13:16:15.928251 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 13:16:16 crc kubenswrapper[4851]: I1001 13:16:16.339249 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d967224f-1bb3-4493-b3da-75fecbe641f8" path="/var/lib/kubelet/pods/d967224f-1bb3-4493-b3da-75fecbe641f8/volumes" Oct 01 13:16:16 crc kubenswrapper[4851]: I1001 13:16:16.412617 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:16:16 crc kubenswrapper[4851]: W1001 13:16:16.417457 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88f8fcd9_1105_4592_a420_98ea9033c3d9.slice/crio-4d97a92d8d5b815cf55076fefb100d605781d133b050b2858a0fc3f73caa090d WatchSource:0}: Error finding container 4d97a92d8d5b815cf55076fefb100d605781d133b050b2858a0fc3f73caa090d: Status 404 returned error can't find the container with id 4d97a92d8d5b815cf55076fefb100d605781d133b050b2858a0fc3f73caa090d Oct 01 13:16:16 crc kubenswrapper[4851]: I1001 13:16:16.477901 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnxhp" event={"ID":"4853c804-3779-47f0-a588-e11f25ec2857","Type":"ContainerStarted","Data":"9794756873b23b6a65d38f842738dda724d8408dae4a922f06a458c9a9936809"} Oct 01 13:16:16 crc kubenswrapper[4851]: I1001 13:16:16.481979 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88f8fcd9-1105-4592-a420-98ea9033c3d9","Type":"ContainerStarted","Data":"4d97a92d8d5b815cf55076fefb100d605781d133b050b2858a0fc3f73caa090d"} Oct 01 13:16:17 crc kubenswrapper[4851]: I1001 13:16:17.521181 4851 generic.go:334] "Generic (PLEG): container finished" podID="4853c804-3779-47f0-a588-e11f25ec2857" containerID="9794756873b23b6a65d38f842738dda724d8408dae4a922f06a458c9a9936809" exitCode=0 Oct 01 13:16:17 crc kubenswrapper[4851]: I1001 13:16:17.521317 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnxhp" event={"ID":"4853c804-3779-47f0-a588-e11f25ec2857","Type":"ContainerDied","Data":"9794756873b23b6a65d38f842738dda724d8408dae4a922f06a458c9a9936809"} Oct 01 13:16:17 crc kubenswrapper[4851]: I1001 13:16:17.526071 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88f8fcd9-1105-4592-a420-98ea9033c3d9","Type":"ContainerStarted","Data":"019c0b057da056a80ee083d5c8e81bcc979649e6360071a8cf8a2313221c41e3"} Oct 01 13:16:17 crc kubenswrapper[4851]: I1001 13:16:17.526368 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88f8fcd9-1105-4592-a420-98ea9033c3d9","Type":"ContainerStarted","Data":"3367c27296314b374dfd2e4683ac6346055435c13afd3e7c5929c2cb124914aa"} Oct 01 13:16:17 crc kubenswrapper[4851]: I1001 13:16:17.596350 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.59632313 podStartE2EDuration="2.59632313s" podCreationTimestamp="2025-10-01 13:16:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:16:17.572724328 +0000 UTC m=+1385.917841854" watchObservedRunningTime="2025-10-01 13:16:17.59632313 +0000 UTC m=+1385.941440646" Oct 01 13:16:17 crc kubenswrapper[4851]: I1001 13:16:17.840447 4851 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 13:16:17 crc kubenswrapper[4851]: I1001 13:16:17.840559 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 13:16:18 crc kubenswrapper[4851]: I1001 13:16:18.561306 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnxhp" event={"ID":"4853c804-3779-47f0-a588-e11f25ec2857","Type":"ContainerStarted","Data":"f9725d2f0479992e01a102171d78a74fee3760b65b6118f90337171fbe061766"} Oct 01 13:16:18 crc kubenswrapper[4851]: I1001 13:16:18.589806 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lnxhp" podStartSLOduration=3.072307334 podStartE2EDuration="5.589779617s" podCreationTimestamp="2025-10-01 13:16:13 +0000 UTC" firstStartedPulling="2025-10-01 13:16:15.445707432 +0000 UTC m=+1383.790824928" lastFinishedPulling="2025-10-01 13:16:17.963179685 +0000 UTC m=+1386.308297211" observedRunningTime="2025-10-01 13:16:18.586233351 +0000 UTC m=+1386.931350837" watchObservedRunningTime="2025-10-01 13:16:18.589779617 +0000 UTC m=+1386.934897143" Oct 01 13:16:18 crc kubenswrapper[4851]: I1001 13:16:18.757643 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 01 13:16:22 crc kubenswrapper[4851]: I1001 13:16:22.840723 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 13:16:22 crc kubenswrapper[4851]: I1001 13:16:22.841162 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 13:16:23 crc kubenswrapper[4851]: I1001 13:16:23.757840 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 01 13:16:23 crc kubenswrapper[4851]: I1001 13:16:23.818193 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 01 13:16:23 crc kubenswrapper[4851]: I1001 13:16:23.858822 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.222:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 13:16:23 crc kubenswrapper[4851]: I1001 13:16:23.858935 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.222:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 13:16:24 crc kubenswrapper[4851]: I1001 13:16:24.315908 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lnxhp" Oct 01 13:16:24 crc kubenswrapper[4851]: I1001 13:16:24.315982 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lnxhp" Oct 01 13:16:24 crc kubenswrapper[4851]: I1001 13:16:24.387240 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lnxhp" Oct 01 13:16:24 crc kubenswrapper[4851]: I1001 13:16:24.692409 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 01 13:16:24 crc 
kubenswrapper[4851]: I1001 13:16:24.735609 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lnxhp" Oct 01 13:16:24 crc kubenswrapper[4851]: I1001 13:16:24.820777 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnxhp"] Oct 01 13:16:25 crc kubenswrapper[4851]: I1001 13:16:25.928992 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 13:16:25 crc kubenswrapper[4851]: I1001 13:16:25.929264 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 13:16:26 crc kubenswrapper[4851]: I1001 13:16:26.660665 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lnxhp" podUID="4853c804-3779-47f0-a588-e11f25ec2857" containerName="registry-server" containerID="cri-o://f9725d2f0479992e01a102171d78a74fee3760b65b6118f90337171fbe061766" gracePeriod=2 Oct 01 13:16:26 crc kubenswrapper[4851]: I1001 13:16:26.939615 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="88f8fcd9-1105-4592-a420-98ea9033c3d9" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.225:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 13:16:26 crc kubenswrapper[4851]: I1001 13:16:26.939632 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="88f8fcd9-1105-4592-a420-98ea9033c3d9" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.225:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 13:16:27 crc kubenswrapper[4851]: I1001 13:16:27.674273 4851 generic.go:334] "Generic (PLEG): container finished" podID="4853c804-3779-47f0-a588-e11f25ec2857" containerID="f9725d2f0479992e01a102171d78a74fee3760b65b6118f90337171fbe061766" exitCode=0 Oct 01 13:16:27 crc kubenswrapper[4851]: I1001 13:16:27.674364 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnxhp" event={"ID":"4853c804-3779-47f0-a588-e11f25ec2857","Type":"ContainerDied","Data":"f9725d2f0479992e01a102171d78a74fee3760b65b6118f90337171fbe061766"} Oct 01 13:16:27 crc kubenswrapper[4851]: I1001 13:16:27.786093 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lnxhp" Oct 01 13:16:27 crc kubenswrapper[4851]: I1001 13:16:27.889683 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4853c804-3779-47f0-a588-e11f25ec2857-catalog-content\") pod \"4853c804-3779-47f0-a588-e11f25ec2857\" (UID: \"4853c804-3779-47f0-a588-e11f25ec2857\") " Oct 01 13:16:27 crc kubenswrapper[4851]: I1001 13:16:27.889828 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4853c804-3779-47f0-a588-e11f25ec2857-utilities\") pod \"4853c804-3779-47f0-a588-e11f25ec2857\" (UID: \"4853c804-3779-47f0-a588-e11f25ec2857\") " Oct 01 13:16:27 crc kubenswrapper[4851]: I1001 13:16:27.889995 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fmr8\" (UniqueName: \"kubernetes.io/projected/4853c804-3779-47f0-a588-e11f25ec2857-kube-api-access-7fmr8\") pod \"4853c804-3779-47f0-a588-e11f25ec2857\" (UID: \"4853c804-3779-47f0-a588-e11f25ec2857\") " Oct 01 13:16:27 crc kubenswrapper[4851]: I1001 13:16:27.890556 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4853c804-3779-47f0-a588-e11f25ec2857-utilities" (OuterVolumeSpecName: "utilities") pod "4853c804-3779-47f0-a588-e11f25ec2857" (UID: "4853c804-3779-47f0-a588-e11f25ec2857"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:16:27 crc kubenswrapper[4851]: I1001 13:16:27.896904 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4853c804-3779-47f0-a588-e11f25ec2857-kube-api-access-7fmr8" (OuterVolumeSpecName: "kube-api-access-7fmr8") pod "4853c804-3779-47f0-a588-e11f25ec2857" (UID: "4853c804-3779-47f0-a588-e11f25ec2857"). InnerVolumeSpecName "kube-api-access-7fmr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:16:27 crc kubenswrapper[4851]: I1001 13:16:27.902811 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4853c804-3779-47f0-a588-e11f25ec2857-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4853c804-3779-47f0-a588-e11f25ec2857" (UID: "4853c804-3779-47f0-a588-e11f25ec2857"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:16:27 crc kubenswrapper[4851]: I1001 13:16:27.992215 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4853c804-3779-47f0-a588-e11f25ec2857-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:27 crc kubenswrapper[4851]: I1001 13:16:27.992257 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4853c804-3779-47f0-a588-e11f25ec2857-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:27 crc kubenswrapper[4851]: I1001 13:16:27.992271 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fmr8\" (UniqueName: \"kubernetes.io/projected/4853c804-3779-47f0-a588-e11f25ec2857-kube-api-access-7fmr8\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:28 crc kubenswrapper[4851]: I1001 13:16:28.687480 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnxhp" event={"ID":"4853c804-3779-47f0-a588-e11f25ec2857","Type":"ContainerDied","Data":"c68e3b3bd56e960bcecdad82e6bc911abbaefcf655e38714a1a328adf8a9ac9c"} Oct 01 13:16:28 crc kubenswrapper[4851]: I1001 13:16:28.687571 4851 scope.go:117] "RemoveContainer" containerID="f9725d2f0479992e01a102171d78a74fee3760b65b6118f90337171fbe061766" Oct 01 13:16:28 crc kubenswrapper[4851]: I1001 13:16:28.687582 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lnxhp" Oct 01 13:16:28 crc kubenswrapper[4851]: I1001 13:16:28.722168 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnxhp"] Oct 01 13:16:28 crc kubenswrapper[4851]: I1001 13:16:28.723724 4851 scope.go:117] "RemoveContainer" containerID="9794756873b23b6a65d38f842738dda724d8408dae4a922f06a458c9a9936809" Oct 01 13:16:28 crc kubenswrapper[4851]: I1001 13:16:28.736763 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnxhp"] Oct 01 13:16:28 crc kubenswrapper[4851]: I1001 13:16:28.754295 4851 scope.go:117] "RemoveContainer" containerID="43a7606c011d8cfe948dfe4cda2f868071960e75c2e5401c5425f5700c8b6910" Oct 01 13:16:30 crc kubenswrapper[4851]: I1001 13:16:30.050354 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:16:30 crc kubenswrapper[4851]: I1001 13:16:30.050424 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:16:30 crc kubenswrapper[4851]: I1001 13:16:30.050481 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 13:16:30 crc kubenswrapper[4851]: I1001 13:16:30.051145 4851 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"45376e593b7f479231d2d5e58c334337acd5d47c4a95ca6e3b37f5047096d591"} 
pod="openshift-machine-config-operator/machine-config-daemon-fv72m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:16:30 crc kubenswrapper[4851]: I1001 13:16:30.051217 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" containerID="cri-o://45376e593b7f479231d2d5e58c334337acd5d47c4a95ca6e3b37f5047096d591" gracePeriod=600 Oct 01 13:16:30 crc kubenswrapper[4851]: I1001 13:16:30.338258 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4853c804-3779-47f0-a588-e11f25ec2857" path="/var/lib/kubelet/pods/4853c804-3779-47f0-a588-e11f25ec2857/volumes" Oct 01 13:16:30 crc kubenswrapper[4851]: I1001 13:16:30.722403 4851 generic.go:334] "Generic (PLEG): container finished" podID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerID="45376e593b7f479231d2d5e58c334337acd5d47c4a95ca6e3b37f5047096d591" exitCode=0 Oct 01 13:16:30 crc kubenswrapper[4851]: I1001 13:16:30.722445 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerDied","Data":"45376e593b7f479231d2d5e58c334337acd5d47c4a95ca6e3b37f5047096d591"} Oct 01 13:16:30 crc kubenswrapper[4851]: I1001 13:16:30.722470 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7"} Oct 01 13:16:30 crc kubenswrapper[4851]: I1001 13:16:30.722486 4851 scope.go:117] "RemoveContainer" containerID="91826d492b7e20fab5770efae0e5337ce96401b4dbf9f9356c89538e943aab30" Oct 01 13:16:32 crc kubenswrapper[4851]: I1001 13:16:32.848978 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 13:16:32 crc kubenswrapper[4851]: I1001 13:16:32.849676 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 13:16:32 crc kubenswrapper[4851]: I1001 13:16:32.865626 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 13:16:32 crc kubenswrapper[4851]: I1001 13:16:32.866679 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 13:16:33 crc kubenswrapper[4851]: I1001 13:16:33.933756 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 01 13:16:35 crc kubenswrapper[4851]: I1001 13:16:35.945556 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 13:16:35 crc kubenswrapper[4851]: I1001 13:16:35.947842 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 13:16:35 crc kubenswrapper[4851]: I1001 13:16:35.957075 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 13:16:35 crc kubenswrapper[4851]: I1001 13:16:35.961584 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 13:16:36 crc kubenswrapper[4851]: I1001 13:16:36.789347 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Oct 01 13:16:36 crc kubenswrapper[4851]: I1001 13:16:36.807012 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 13:16:42 crc kubenswrapper[4851]: I1001 13:16:42.396497 4851 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod68393f76-bfd1-40fc-b0e7-01bd567a657e"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod68393f76-bfd1-40fc-b0e7-01bd567a657e] : Timed out while waiting for systemd to remove kubepods-besteffort-pod68393f76_bfd1_40fc_b0e7_01bd567a657e.slice" Oct 01 13:16:45 crc kubenswrapper[4851]: I1001 13:16:45.377595 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 13:16:46 crc kubenswrapper[4851]: I1001 13:16:46.769957 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 13:16:48 crc kubenswrapper[4851]: I1001 13:16:48.676607 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="712c3704-a775-4fac-81d6-9aa9cfdc48ef" containerName="rabbitmq" containerID="cri-o://ed5aa2aa3d0499ef8bdca55947cbc7697b854f3673f0b90ed8759cfa5950d638" gracePeriod=604797 Oct 01 13:16:49 crc kubenswrapper[4851]: I1001 13:16:49.927314 4851 generic.go:334] "Generic (PLEG): container finished" podID="712c3704-a775-4fac-81d6-9aa9cfdc48ef" containerID="ed5aa2aa3d0499ef8bdca55947cbc7697b854f3673f0b90ed8759cfa5950d638" exitCode=0 Oct 01 13:16:49 crc kubenswrapper[4851]: I1001 13:16:49.927652 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"712c3704-a775-4fac-81d6-9aa9cfdc48ef","Type":"ContainerDied","Data":"ed5aa2aa3d0499ef8bdca55947cbc7697b854f3673f0b90ed8759cfa5950d638"} Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.006580 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="95c53639-696f-4d10-a297-7173dd3b394f" containerName="rabbitmq" containerID="cri-o://484a61f0f7ac368499b822805fd5cb3b85d684289d2657f90c7f00476f9f1707" gracePeriod=604797 Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.354351 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.493281 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/712c3704-a775-4fac-81d6-9aa9cfdc48ef-pod-info\") pod \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.493362 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.493424 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/712c3704-a775-4fac-81d6-9aa9cfdc48ef-server-conf\") pod \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.493470 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/712c3704-a775-4fac-81d6-9aa9cfdc48ef-config-data\") pod \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.493489 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-confd\") pod \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.493564 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-tls\") pod \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.493640 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9x7g\" (UniqueName: \"kubernetes.io/projected/712c3704-a775-4fac-81d6-9aa9cfdc48ef-kube-api-access-p9x7g\") pod \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.493675 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/712c3704-a775-4fac-81d6-9aa9cfdc48ef-erlang-cookie-secret\") pod \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.493701 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-erlang-cookie\") pod \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.493776 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-plugins\") pod \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\" (UID: 
\"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.493833 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/712c3704-a775-4fac-81d6-9aa9cfdc48ef-plugins-conf\") pod \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\" (UID: \"712c3704-a775-4fac-81d6-9aa9cfdc48ef\") " Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.494247 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "712c3704-a775-4fac-81d6-9aa9cfdc48ef" (UID: "712c3704-a775-4fac-81d6-9aa9cfdc48ef"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.494278 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "712c3704-a775-4fac-81d6-9aa9cfdc48ef" (UID: "712c3704-a775-4fac-81d6-9aa9cfdc48ef"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.494846 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/712c3704-a775-4fac-81d6-9aa9cfdc48ef-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "712c3704-a775-4fac-81d6-9aa9cfdc48ef" (UID: "712c3704-a775-4fac-81d6-9aa9cfdc48ef"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.500564 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/712c3704-a775-4fac-81d6-9aa9cfdc48ef-kube-api-access-p9x7g" (OuterVolumeSpecName: "kube-api-access-p9x7g") pod "712c3704-a775-4fac-81d6-9aa9cfdc48ef" (UID: "712c3704-a775-4fac-81d6-9aa9cfdc48ef"). InnerVolumeSpecName "kube-api-access-p9x7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.501256 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "712c3704-a775-4fac-81d6-9aa9cfdc48ef" (UID: "712c3704-a775-4fac-81d6-9aa9cfdc48ef"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.501267 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/712c3704-a775-4fac-81d6-9aa9cfdc48ef-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "712c3704-a775-4fac-81d6-9aa9cfdc48ef" (UID: "712c3704-a775-4fac-81d6-9aa9cfdc48ef"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.503616 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "712c3704-a775-4fac-81d6-9aa9cfdc48ef" (UID: "712c3704-a775-4fac-81d6-9aa9cfdc48ef"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.506737 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/712c3704-a775-4fac-81d6-9aa9cfdc48ef-pod-info" (OuterVolumeSpecName: "pod-info") pod "712c3704-a775-4fac-81d6-9aa9cfdc48ef" (UID: "712c3704-a775-4fac-81d6-9aa9cfdc48ef"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.524126 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/712c3704-a775-4fac-81d6-9aa9cfdc48ef-config-data" (OuterVolumeSpecName: "config-data") pod "712c3704-a775-4fac-81d6-9aa9cfdc48ef" (UID: "712c3704-a775-4fac-81d6-9aa9cfdc48ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.573152 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/712c3704-a775-4fac-81d6-9aa9cfdc48ef-server-conf" (OuterVolumeSpecName: "server-conf") pod "712c3704-a775-4fac-81d6-9aa9cfdc48ef" (UID: "712c3704-a775-4fac-81d6-9aa9cfdc48ef"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.595619 4851 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.595652 4851 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/712c3704-a775-4fac-81d6-9aa9cfdc48ef-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.595661 4851 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/712c3704-a775-4fac-81d6-9aa9cfdc48ef-pod-info\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.595703 4851 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.595713 4851 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/712c3704-a775-4fac-81d6-9aa9cfdc48ef-server-conf\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.595720 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/712c3704-a775-4fac-81d6-9aa9cfdc48ef-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.595729 4851 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.595737 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9x7g\" (UniqueName: \"kubernetes.io/projected/712c3704-a775-4fac-81d6-9aa9cfdc48ef-kube-api-access-p9x7g\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.595748 4851 
reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/712c3704-a775-4fac-81d6-9aa9cfdc48ef-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.595756 4851 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.639360 4851 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.669984 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "712c3704-a775-4fac-81d6-9aa9cfdc48ef" (UID: "712c3704-a775-4fac-81d6-9aa9cfdc48ef"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.696895 4851 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.696928 4851 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/712c3704-a775-4fac-81d6-9aa9cfdc48ef-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.937363 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"712c3704-a775-4fac-81d6-9aa9cfdc48ef","Type":"ContainerDied","Data":"8214e828bf9c3d248a79ae102512b98df8b697071f7e04edd37199923d3f58dd"} Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.937673 4851 scope.go:117] "RemoveContainer" containerID="ed5aa2aa3d0499ef8bdca55947cbc7697b854f3673f0b90ed8759cfa5950d638" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.937807 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.991013 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 13:16:50 crc kubenswrapper[4851]: I1001 13:16:50.994184 4851 scope.go:117] "RemoveContainer" containerID="413d60ea21361b7065a174a88bcb5289e9879147b33e039ab6917e97367daef4" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.000251 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.034563 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 13:16:51 crc kubenswrapper[4851]: E1001 13:16:51.035072 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4853c804-3779-47f0-a588-e11f25ec2857" containerName="extract-content" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.035096 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="4853c804-3779-47f0-a588-e11f25ec2857" containerName="extract-content" Oct 01 13:16:51 crc kubenswrapper[4851]: E1001 13:16:51.035126 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="712c3704-a775-4fac-81d6-9aa9cfdc48ef" containerName="rabbitmq" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.035135 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="712c3704-a775-4fac-81d6-9aa9cfdc48ef" containerName="rabbitmq" Oct 01 13:16:51 crc kubenswrapper[4851]: E1001 13:16:51.035146 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4853c804-3779-47f0-a588-e11f25ec2857" containerName="extract-utilities" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.035154 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="4853c804-3779-47f0-a588-e11f25ec2857" containerName="extract-utilities" Oct 01 13:16:51 crc kubenswrapper[4851]: E1001 13:16:51.035174 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4853c804-3779-47f0-a588-e11f25ec2857" containerName="registry-server" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.035183 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="4853c804-3779-47f0-a588-e11f25ec2857" containerName="registry-server" Oct 01 13:16:51 crc kubenswrapper[4851]: E1001 13:16:51.035206 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="712c3704-a775-4fac-81d6-9aa9cfdc48ef" containerName="setup-container" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.035215 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="712c3704-a775-4fac-81d6-9aa9cfdc48ef" containerName="setup-container" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.035466 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="4853c804-3779-47f0-a588-e11f25ec2857" containerName="registry-server" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.035498 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="712c3704-a775-4fac-81d6-9aa9cfdc48ef" containerName="rabbitmq" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.037062 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.039677 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-rx8zc" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.039887 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.040612 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.041050 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.041258 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.041559 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.041735 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.047401 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.205615 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-config-data\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.205845 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.205900 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.205971 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.206098 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.206165 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.206476 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.206617 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szdnj\" (UniqueName: \"kubernetes.io/projected/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-kube-api-access-szdnj\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.206676 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.206761 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.206799 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.308975 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.309023 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.309052 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.309077 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " 
pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.309097 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.309146 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.309171 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szdnj\" (UniqueName: \"kubernetes.io/projected/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-kube-api-access-szdnj\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.309195 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.309220 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.309239 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.309279 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-config-data\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.310200 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-config-data\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.310472 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.310831 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.312325 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.312791 4851 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.315931 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.315954 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.316249 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.321877 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.324607 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.332572 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szdnj\" (UniqueName: \"kubernetes.io/projected/73f3b1c2-f1c0-47b8-bf31-4d2a185c852e-kube-api-access-szdnj\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.359030 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e\") " pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.379058 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.604652 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.717861 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-tls\") pod \"95c53639-696f-4d10-a297-7173dd3b394f\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.718724 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgmt5\" (UniqueName: \"kubernetes.io/projected/95c53639-696f-4d10-a297-7173dd3b394f-kube-api-access-mgmt5\") pod \"95c53639-696f-4d10-a297-7173dd3b394f\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.718761 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-plugins\") pod \"95c53639-696f-4d10-a297-7173dd3b394f\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.718780 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-confd\") pod \"95c53639-696f-4d10-a297-7173dd3b394f\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.718812 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-erlang-cookie\") pod \"95c53639-696f-4d10-a297-7173dd3b394f\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.718855 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"95c53639-696f-4d10-a297-7173dd3b394f\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.718945 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/95c53639-696f-4d10-a297-7173dd3b394f-config-data\") pod \"95c53639-696f-4d10-a297-7173dd3b394f\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.718994 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/95c53639-696f-4d10-a297-7173dd3b394f-plugins-conf\") pod \"95c53639-696f-4d10-a297-7173dd3b394f\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.719061 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/95c53639-696f-4d10-a297-7173dd3b394f-erlang-cookie-secret\") pod \"95c53639-696f-4d10-a297-7173dd3b394f\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.719083 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/95c53639-696f-4d10-a297-7173dd3b394f-server-conf\") pod \"95c53639-696f-4d10-a297-7173dd3b394f\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.719142 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/95c53639-696f-4d10-a297-7173dd3b394f-pod-info\") pod \"95c53639-696f-4d10-a297-7173dd3b394f\" (UID: \"95c53639-696f-4d10-a297-7173dd3b394f\") " Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.719466 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "95c53639-696f-4d10-a297-7173dd3b394f" (UID: "95c53639-696f-4d10-a297-7173dd3b394f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.719545 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "95c53639-696f-4d10-a297-7173dd3b394f" (UID: "95c53639-696f-4d10-a297-7173dd3b394f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.719692 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95c53639-696f-4d10-a297-7173dd3b394f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "95c53639-696f-4d10-a297-7173dd3b394f" (UID: "95c53639-696f-4d10-a297-7173dd3b394f"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.719972 4851 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.719987 4851 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.719998 4851 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/95c53639-696f-4d10-a297-7173dd3b394f-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.725395 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/95c53639-696f-4d10-a297-7173dd3b394f-pod-info" (OuterVolumeSpecName: "pod-info") pod "95c53639-696f-4d10-a297-7173dd3b394f" (UID: "95c53639-696f-4d10-a297-7173dd3b394f"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.725896 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "95c53639-696f-4d10-a297-7173dd3b394f" (UID: "95c53639-696f-4d10-a297-7173dd3b394f"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.727117 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c53639-696f-4d10-a297-7173dd3b394f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "95c53639-696f-4d10-a297-7173dd3b394f" (UID: "95c53639-696f-4d10-a297-7173dd3b394f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.727128 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95c53639-696f-4d10-a297-7173dd3b394f-kube-api-access-mgmt5" (OuterVolumeSpecName: "kube-api-access-mgmt5") pod "95c53639-696f-4d10-a297-7173dd3b394f" (UID: "95c53639-696f-4d10-a297-7173dd3b394f"). InnerVolumeSpecName "kube-api-access-mgmt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.729062 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "95c53639-696f-4d10-a297-7173dd3b394f" (UID: "95c53639-696f-4d10-a297-7173dd3b394f"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.755253 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95c53639-696f-4d10-a297-7173dd3b394f-config-data" (OuterVolumeSpecName: "config-data") pod "95c53639-696f-4d10-a297-7173dd3b394f" (UID: "95c53639-696f-4d10-a297-7173dd3b394f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.780320 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95c53639-696f-4d10-a297-7173dd3b394f-server-conf" (OuterVolumeSpecName: "server-conf") pod "95c53639-696f-4d10-a297-7173dd3b394f" (UID: "95c53639-696f-4d10-a297-7173dd3b394f"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.822169 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/95c53639-696f-4d10-a297-7173dd3b394f-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.822231 4851 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/95c53639-696f-4d10-a297-7173dd3b394f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.822245 4851 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/95c53639-696f-4d10-a297-7173dd3b394f-server-conf\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.822257 4851 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/95c53639-696f-4d10-a297-7173dd3b394f-pod-info\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.822267 4851 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.822297 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgmt5\" (UniqueName: \"kubernetes.io/projected/95c53639-696f-4d10-a297-7173dd3b394f-kube-api-access-mgmt5\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.822333 4851 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.847775 4851 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.880550 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "95c53639-696f-4d10-a297-7173dd3b394f" (UID: "95c53639-696f-4d10-a297-7173dd3b394f"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.918119 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.924078 4851 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/95c53639-696f-4d10-a297-7173dd3b394f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.924108 4851 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.949882 4851 generic.go:334] "Generic (PLEG): container finished" podID="95c53639-696f-4d10-a297-7173dd3b394f" containerID="484a61f0f7ac368499b822805fd5cb3b85d684289d2657f90c7f00476f9f1707" exitCode=0 Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.949950 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"95c53639-696f-4d10-a297-7173dd3b394f","Type":"ContainerDied","Data":"484a61f0f7ac368499b822805fd5cb3b85d684289d2657f90c7f00476f9f1707"} Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.949958 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.949980 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"95c53639-696f-4d10-a297-7173dd3b394f","Type":"ContainerDied","Data":"c2a1ac02edab40d1ace222280655edd43af85e0df391ea83b621a64b5849bc71"} Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.949999 4851 scope.go:117] "RemoveContainer" containerID="484a61f0f7ac368499b822805fd5cb3b85d684289d2657f90c7f00476f9f1707" Oct 01 13:16:51 crc kubenswrapper[4851]: I1001 13:16:51.953522 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e","Type":"ContainerStarted","Data":"e4e1eb14752deeaef05e15c2ebe1c0983641bb4c2a475f37df3d2cb89254515e"} Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.072180 4851 scope.go:117] "RemoveContainer" containerID="83ea666e82ffc032b51fb419fe2f3e2cbf43bf29cd27389317204ca318397904" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.101537 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.103663 4851 scope.go:117] "RemoveContainer" containerID="484a61f0f7ac368499b822805fd5cb3b85d684289d2657f90c7f00476f9f1707" Oct 01 13:16:52 crc kubenswrapper[4851]: E1001 13:16:52.104169 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"484a61f0f7ac368499b822805fd5cb3b85d684289d2657f90c7f00476f9f1707\": container with ID starting with 484a61f0f7ac368499b822805fd5cb3b85d684289d2657f90c7f00476f9f1707 not found: ID does not exist" containerID="484a61f0f7ac368499b822805fd5cb3b85d684289d2657f90c7f00476f9f1707" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.104204 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"484a61f0f7ac368499b822805fd5cb3b85d684289d2657f90c7f00476f9f1707"} err="failed to get container status 
\"484a61f0f7ac368499b822805fd5cb3b85d684289d2657f90c7f00476f9f1707\": rpc error: code = NotFound desc = could not find container \"484a61f0f7ac368499b822805fd5cb3b85d684289d2657f90c7f00476f9f1707\": container with ID starting with 484a61f0f7ac368499b822805fd5cb3b85d684289d2657f90c7f00476f9f1707 not found: ID does not exist" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.104241 4851 scope.go:117] "RemoveContainer" containerID="83ea666e82ffc032b51fb419fe2f3e2cbf43bf29cd27389317204ca318397904" Oct 01 13:16:52 crc kubenswrapper[4851]: E1001 13:16:52.105214 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83ea666e82ffc032b51fb419fe2f3e2cbf43bf29cd27389317204ca318397904\": container with ID starting with 83ea666e82ffc032b51fb419fe2f3e2cbf43bf29cd27389317204ca318397904 not found: ID does not exist" containerID="83ea666e82ffc032b51fb419fe2f3e2cbf43bf29cd27389317204ca318397904" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.105258 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83ea666e82ffc032b51fb419fe2f3e2cbf43bf29cd27389317204ca318397904"} err="failed to get container status \"83ea666e82ffc032b51fb419fe2f3e2cbf43bf29cd27389317204ca318397904\": rpc error: code = NotFound desc = could not find container \"83ea666e82ffc032b51fb419fe2f3e2cbf43bf29cd27389317204ca318397904\": container with ID starting with 83ea666e82ffc032b51fb419fe2f3e2cbf43bf29cd27389317204ca318397904 not found: ID does not exist" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.112356 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.121517 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 13:16:52 crc kubenswrapper[4851]: E1001 13:16:52.121875 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c53639-696f-4d10-a297-7173dd3b394f" containerName="setup-container" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.121891 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c53639-696f-4d10-a297-7173dd3b394f" containerName="setup-container" Oct 01 13:16:52 crc kubenswrapper[4851]: E1001 13:16:52.121932 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c53639-696f-4d10-a297-7173dd3b394f" containerName="rabbitmq" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.121938 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c53639-696f-4d10-a297-7173dd3b394f" containerName="rabbitmq" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.122101 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="95c53639-696f-4d10-a297-7173dd3b394f" containerName="rabbitmq" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.123311 4851 util.go:30] "No sandbox for pod can be found. 
Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.123311 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.126010 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.126032 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.126102 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-dqh5p"
Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.126170 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.126406 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.126896 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.127216 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.144840 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.228957 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3dc04e94-f66a-4937-86d2-24def7247794-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.229049 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3dc04e94-f66a-4937-86d2-24def7247794-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.229099 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3dc04e94-f66a-4937-86d2-24def7247794-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.229124 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3dc04e94-f66a-4937-86d2-24def7247794-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.229180 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.229208 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3dc04e94-f66a-4937-86d2-24def7247794-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.229393 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4btb\" (UniqueName: \"kubernetes.io/projected/3dc04e94-f66a-4937-86d2-24def7247794-kube-api-access-w4btb\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.229513 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3dc04e94-f66a-4937-86d2-24def7247794-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.229572 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3dc04e94-f66a-4937-86d2-24def7247794-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.229644 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3dc04e94-f66a-4937-86d2-24def7247794-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.229710 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3dc04e94-f66a-4937-86d2-24def7247794-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.331326 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3dc04e94-f66a-4937-86d2-24def7247794-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.331391 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3dc04e94-f66a-4937-86d2-24def7247794-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.331413 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3dc04e94-f66a-4937-86d2-24def7247794-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.331446 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.331472 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3dc04e94-f66a-4937-86d2-24def7247794-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.331538 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4btb\" (UniqueName: \"kubernetes.io/projected/3dc04e94-f66a-4937-86d2-24def7247794-kube-api-access-w4btb\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.331566 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3dc04e94-f66a-4937-86d2-24def7247794-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.331590 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3dc04e94-f66a-4937-86d2-24def7247794-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.331619 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3dc04e94-f66a-4937-86d2-24def7247794-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.331636 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3dc04e94-f66a-4937-86d2-24def7247794-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.331671 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3dc04e94-f66a-4937-86d2-24def7247794-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.331899 4851 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.332146 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3dc04e94-f66a-4937-86d2-24def7247794-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.333014 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3dc04e94-f66a-4937-86d2-24def7247794-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.333038 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3dc04e94-f66a-4937-86d2-24def7247794-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.333272 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3dc04e94-f66a-4937-86d2-24def7247794-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.333725 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3dc04e94-f66a-4937-86d2-24def7247794-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.336310 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3dc04e94-f66a-4937-86d2-24def7247794-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.336581 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3dc04e94-f66a-4937-86d2-24def7247794-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.336652 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3dc04e94-f66a-4937-86d2-24def7247794-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.338463 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3dc04e94-f66a-4937-86d2-24def7247794-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.346675 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="712c3704-a775-4fac-81d6-9aa9cfdc48ef" path="/var/lib/kubelet/pods/712c3704-a775-4fac-81d6-9aa9cfdc48ef/volumes" Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.349687 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95c53639-696f-4d10-a297-7173dd3b394f" path="/var/lib/kubelet/pods/95c53639-696f-4d10-a297-7173dd3b394f/volumes" Oct 01 
Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.363067 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4btb\" (UniqueName: \"kubernetes.io/projected/3dc04e94-f66a-4937-86d2-24def7247794-kube-api-access-w4btb\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.372488 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dc04e94-f66a-4937-86d2-24def7247794\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.453990 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.895997 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 01 13:16:52 crc kubenswrapper[4851]: I1001 13:16:52.982093 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3dc04e94-f66a-4937-86d2-24def7247794","Type":"ContainerStarted","Data":"d951ca4305e1265155be3d5a29b391958799e73fdc533583c3f890eb7ca4271e"}
Oct 01 13:16:54 crc kubenswrapper[4851]: I1001 13:16:54.000235 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e","Type":"ContainerStarted","Data":"2ae1a7e9c17889412cebc7c7cbb00d8523a34533d94910317e518bc56b5a27cc"}
Oct 01 13:16:55 crc kubenswrapper[4851]: I1001 13:16:55.012942 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3dc04e94-f66a-4937-86d2-24def7247794","Type":"ContainerStarted","Data":"8c9577d8cb562ce4f8e7c62e531afedec52f17beadd68d3d53eff0f72f12dd28"}
Oct 01 13:17:00 crc kubenswrapper[4851]: I1001 13:17:00.791534 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-696b555f9-rrxdl"]
Oct 01 13:17:00 crc kubenswrapper[4851]: I1001 13:17:00.793488 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:00 crc kubenswrapper[4851]: I1001 13:17:00.795722 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 01 13:17:00 crc kubenswrapper[4851]: I1001 13:17:00.810235 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-696b555f9-rrxdl"] Oct 01 13:17:00 crc kubenswrapper[4851]: I1001 13:17:00.945013 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnwn5\" (UniqueName: \"kubernetes.io/projected/2c35b1e3-81f3-400d-944b-e150f959dc31-kube-api-access-lnwn5\") pod \"dnsmasq-dns-696b555f9-rrxdl\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:00 crc kubenswrapper[4851]: I1001 13:17:00.945082 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-ovsdbserver-sb\") pod \"dnsmasq-dns-696b555f9-rrxdl\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:00 crc kubenswrapper[4851]: I1001 13:17:00.945197 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-dns-svc\") pod \"dnsmasq-dns-696b555f9-rrxdl\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:00 crc kubenswrapper[4851]: I1001 13:17:00.945223 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-openstack-edpm-ipam\") pod \"dnsmasq-dns-696b555f9-rrxdl\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:00 crc kubenswrapper[4851]: I1001 13:17:00.945260 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-dns-swift-storage-0\") pod \"dnsmasq-dns-696b555f9-rrxdl\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:00 crc kubenswrapper[4851]: I1001 13:17:00.945281 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-ovsdbserver-nb\") pod \"dnsmasq-dns-696b555f9-rrxdl\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:00 crc kubenswrapper[4851]: I1001 13:17:00.945307 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-config\") pod \"dnsmasq-dns-696b555f9-rrxdl\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:01 crc kubenswrapper[4851]: I1001 13:17:01.046912 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-dns-swift-storage-0\") pod \"dnsmasq-dns-696b555f9-rrxdl\" (UID: 
\"2c35b1e3-81f3-400d-944b-e150f959dc31\") " pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:01 crc kubenswrapper[4851]: I1001 13:17:01.047006 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-ovsdbserver-nb\") pod \"dnsmasq-dns-696b555f9-rrxdl\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:01 crc kubenswrapper[4851]: I1001 13:17:01.047059 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-config\") pod \"dnsmasq-dns-696b555f9-rrxdl\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:01 crc kubenswrapper[4851]: I1001 13:17:01.047168 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnwn5\" (UniqueName: \"kubernetes.io/projected/2c35b1e3-81f3-400d-944b-e150f959dc31-kube-api-access-lnwn5\") pod \"dnsmasq-dns-696b555f9-rrxdl\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:01 crc kubenswrapper[4851]: I1001 13:17:01.047251 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-ovsdbserver-sb\") pod \"dnsmasq-dns-696b555f9-rrxdl\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:01 crc kubenswrapper[4851]: I1001 13:17:01.047401 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-dns-svc\") pod \"dnsmasq-dns-696b555f9-rrxdl\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:01 crc kubenswrapper[4851]: I1001 13:17:01.047446 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-openstack-edpm-ipam\") pod \"dnsmasq-dns-696b555f9-rrxdl\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:01 crc kubenswrapper[4851]: I1001 13:17:01.048254 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-ovsdbserver-nb\") pod \"dnsmasq-dns-696b555f9-rrxdl\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:01 crc kubenswrapper[4851]: I1001 13:17:01.048394 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-dns-swift-storage-0\") pod \"dnsmasq-dns-696b555f9-rrxdl\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:01 crc kubenswrapper[4851]: I1001 13:17:01.048454 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-ovsdbserver-sb\") pod \"dnsmasq-dns-696b555f9-rrxdl\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 
01 13:17:01 crc kubenswrapper[4851]: I1001 13:17:01.048456 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-config\") pod \"dnsmasq-dns-696b555f9-rrxdl\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:01 crc kubenswrapper[4851]: I1001 13:17:01.048714 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-dns-svc\") pod \"dnsmasq-dns-696b555f9-rrxdl\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:01 crc kubenswrapper[4851]: I1001 13:17:01.048877 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-openstack-edpm-ipam\") pod \"dnsmasq-dns-696b555f9-rrxdl\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:01 crc kubenswrapper[4851]: I1001 13:17:01.070317 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnwn5\" (UniqueName: \"kubernetes.io/projected/2c35b1e3-81f3-400d-944b-e150f959dc31-kube-api-access-lnwn5\") pod \"dnsmasq-dns-696b555f9-rrxdl\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:01 crc kubenswrapper[4851]: I1001 13:17:01.110368 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:01 crc kubenswrapper[4851]: I1001 13:17:01.595783 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-696b555f9-rrxdl"] Oct 01 13:17:01 crc kubenswrapper[4851]: W1001 13:17:01.597889 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c35b1e3_81f3_400d_944b_e150f959dc31.slice/crio-2816e67c80ef12029fa876e98a6516eda6c80e63176684ef7869d725274377b5 WatchSource:0}: Error finding container 2816e67c80ef12029fa876e98a6516eda6c80e63176684ef7869d725274377b5: Status 404 returned error can't find the container with id 2816e67c80ef12029fa876e98a6516eda6c80e63176684ef7869d725274377b5 Oct 01 13:17:02 crc kubenswrapper[4851]: I1001 13:17:02.109561 4851 generic.go:334] "Generic (PLEG): container finished" podID="2c35b1e3-81f3-400d-944b-e150f959dc31" containerID="da15fdf2e5fea329896ab7f98e490c5e3b80ae09437f6a7d5aa8a7bcc2de7eca" exitCode=0 Oct 01 13:17:02 crc kubenswrapper[4851]: I1001 13:17:02.109604 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696b555f9-rrxdl" event={"ID":"2c35b1e3-81f3-400d-944b-e150f959dc31","Type":"ContainerDied","Data":"da15fdf2e5fea329896ab7f98e490c5e3b80ae09437f6a7d5aa8a7bcc2de7eca"} Oct 01 13:17:02 crc kubenswrapper[4851]: I1001 13:17:02.109633 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696b555f9-rrxdl" event={"ID":"2c35b1e3-81f3-400d-944b-e150f959dc31","Type":"ContainerStarted","Data":"2816e67c80ef12029fa876e98a6516eda6c80e63176684ef7869d725274377b5"} Oct 01 13:17:03 crc kubenswrapper[4851]: I1001 13:17:03.122451 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696b555f9-rrxdl" 
event={"ID":"2c35b1e3-81f3-400d-944b-e150f959dc31","Type":"ContainerStarted","Data":"fe2ace37468bb514221303279eb3e2ea7b1ddeff36cfda3368cd36d586722216"} Oct 01 13:17:03 crc kubenswrapper[4851]: I1001 13:17:03.122844 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:03 crc kubenswrapper[4851]: I1001 13:17:03.158065 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-696b555f9-rrxdl" podStartSLOduration=3.158040619 podStartE2EDuration="3.158040619s" podCreationTimestamp="2025-10-01 13:17:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:17:03.156900066 +0000 UTC m=+1431.502017582" watchObservedRunningTime="2025-10-01 13:17:03.158040619 +0000 UTC m=+1431.503158115" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.111785 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.192671 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74745b66cf-sh8zk"] Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.192946 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" podUID="10f182d6-67d2-4878-afcc-3e38d1f689aa" containerName="dnsmasq-dns" containerID="cri-o://311962d94f7a32ac08ddb66e50fd2304671fec1892f4c2ca64d9941e19d957df" gracePeriod=10 Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.377179 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-659b5fb8c5-5wqbr"] Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.378786 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.391131 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-659b5fb8c5-5wqbr"] Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.485441 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73149e01-4273-4703-a6bd-0b44c3ce5aad-ovsdbserver-sb\") pod \"dnsmasq-dns-659b5fb8c5-5wqbr\" (UID: \"73149e01-4273-4703-a6bd-0b44c3ce5aad\") " pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.485490 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73149e01-4273-4703-a6bd-0b44c3ce5aad-dns-svc\") pod \"dnsmasq-dns-659b5fb8c5-5wqbr\" (UID: \"73149e01-4273-4703-a6bd-0b44c3ce5aad\") " pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.485531 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73149e01-4273-4703-a6bd-0b44c3ce5aad-dns-swift-storage-0\") pod \"dnsmasq-dns-659b5fb8c5-5wqbr\" (UID: \"73149e01-4273-4703-a6bd-0b44c3ce5aad\") " pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.485819 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73149e01-4273-4703-a6bd-0b44c3ce5aad-config\") pod \"dnsmasq-dns-659b5fb8c5-5wqbr\" (UID: \"73149e01-4273-4703-a6bd-0b44c3ce5aad\") " pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.485910 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f7ds\" (UniqueName: \"kubernetes.io/projected/73149e01-4273-4703-a6bd-0b44c3ce5aad-kube-api-access-8f7ds\") pod \"dnsmasq-dns-659b5fb8c5-5wqbr\" (UID: \"73149e01-4273-4703-a6bd-0b44c3ce5aad\") " pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.486092 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/73149e01-4273-4703-a6bd-0b44c3ce5aad-openstack-edpm-ipam\") pod \"dnsmasq-dns-659b5fb8c5-5wqbr\" (UID: \"73149e01-4273-4703-a6bd-0b44c3ce5aad\") " pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.486186 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73149e01-4273-4703-a6bd-0b44c3ce5aad-ovsdbserver-nb\") pod \"dnsmasq-dns-659b5fb8c5-5wqbr\" (UID: \"73149e01-4273-4703-a6bd-0b44c3ce5aad\") " pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.589346 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/73149e01-4273-4703-a6bd-0b44c3ce5aad-openstack-edpm-ipam\") pod \"dnsmasq-dns-659b5fb8c5-5wqbr\" (UID: \"73149e01-4273-4703-a6bd-0b44c3ce5aad\") " pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 
13:17:11.589445 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73149e01-4273-4703-a6bd-0b44c3ce5aad-ovsdbserver-nb\") pod \"dnsmasq-dns-659b5fb8c5-5wqbr\" (UID: \"73149e01-4273-4703-a6bd-0b44c3ce5aad\") " pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.589674 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73149e01-4273-4703-a6bd-0b44c3ce5aad-ovsdbserver-sb\") pod \"dnsmasq-dns-659b5fb8c5-5wqbr\" (UID: \"73149e01-4273-4703-a6bd-0b44c3ce5aad\") " pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.589801 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73149e01-4273-4703-a6bd-0b44c3ce5aad-dns-svc\") pod \"dnsmasq-dns-659b5fb8c5-5wqbr\" (UID: \"73149e01-4273-4703-a6bd-0b44c3ce5aad\") " pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.589863 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73149e01-4273-4703-a6bd-0b44c3ce5aad-dns-swift-storage-0\") pod \"dnsmasq-dns-659b5fb8c5-5wqbr\" (UID: \"73149e01-4273-4703-a6bd-0b44c3ce5aad\") " pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.590068 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73149e01-4273-4703-a6bd-0b44c3ce5aad-config\") pod \"dnsmasq-dns-659b5fb8c5-5wqbr\" (UID: \"73149e01-4273-4703-a6bd-0b44c3ce5aad\") " pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.590177 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f7ds\" (UniqueName: \"kubernetes.io/projected/73149e01-4273-4703-a6bd-0b44c3ce5aad-kube-api-access-8f7ds\") pod \"dnsmasq-dns-659b5fb8c5-5wqbr\" (UID: \"73149e01-4273-4703-a6bd-0b44c3ce5aad\") " pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.590318 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73149e01-4273-4703-a6bd-0b44c3ce5aad-ovsdbserver-nb\") pod \"dnsmasq-dns-659b5fb8c5-5wqbr\" (UID: \"73149e01-4273-4703-a6bd-0b44c3ce5aad\") " pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.590471 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73149e01-4273-4703-a6bd-0b44c3ce5aad-dns-svc\") pod \"dnsmasq-dns-659b5fb8c5-5wqbr\" (UID: \"73149e01-4273-4703-a6bd-0b44c3ce5aad\") " pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.590634 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73149e01-4273-4703-a6bd-0b44c3ce5aad-ovsdbserver-sb\") pod \"dnsmasq-dns-659b5fb8c5-5wqbr\" (UID: \"73149e01-4273-4703-a6bd-0b44c3ce5aad\") " pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.590785 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73149e01-4273-4703-a6bd-0b44c3ce5aad-dns-swift-storage-0\") pod \"dnsmasq-dns-659b5fb8c5-5wqbr\" (UID: \"73149e01-4273-4703-a6bd-0b44c3ce5aad\") " pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.590878 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73149e01-4273-4703-a6bd-0b44c3ce5aad-config\") pod \"dnsmasq-dns-659b5fb8c5-5wqbr\" (UID: \"73149e01-4273-4703-a6bd-0b44c3ce5aad\") " pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.590908 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/73149e01-4273-4703-a6bd-0b44c3ce5aad-openstack-edpm-ipam\") pod \"dnsmasq-dns-659b5fb8c5-5wqbr\" (UID: \"73149e01-4273-4703-a6bd-0b44c3ce5aad\") " pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.626249 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f7ds\" (UniqueName: \"kubernetes.io/projected/73149e01-4273-4703-a6bd-0b44c3ce5aad-kube-api-access-8f7ds\") pod \"dnsmasq-dns-659b5fb8c5-5wqbr\" (UID: \"73149e01-4273-4703-a6bd-0b44c3ce5aad\") " pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.712015 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.832002 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.900975 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dcx8\" (UniqueName: \"kubernetes.io/projected/10f182d6-67d2-4878-afcc-3e38d1f689aa-kube-api-access-7dcx8\") pod \"10f182d6-67d2-4878-afcc-3e38d1f689aa\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.901009 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-dns-swift-storage-0\") pod \"10f182d6-67d2-4878-afcc-3e38d1f689aa\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.901073 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-ovsdbserver-sb\") pod \"10f182d6-67d2-4878-afcc-3e38d1f689aa\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.901213 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-ovsdbserver-nb\") pod \"10f182d6-67d2-4878-afcc-3e38d1f689aa\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.901265 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-config\") pod \"10f182d6-67d2-4878-afcc-3e38d1f689aa\" (UID: 
\"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.901303 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-dns-svc\") pod \"10f182d6-67d2-4878-afcc-3e38d1f689aa\" (UID: \"10f182d6-67d2-4878-afcc-3e38d1f689aa\") " Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.908630 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10f182d6-67d2-4878-afcc-3e38d1f689aa-kube-api-access-7dcx8" (OuterVolumeSpecName: "kube-api-access-7dcx8") pod "10f182d6-67d2-4878-afcc-3e38d1f689aa" (UID: "10f182d6-67d2-4878-afcc-3e38d1f689aa"). InnerVolumeSpecName "kube-api-access-7dcx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.966726 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "10f182d6-67d2-4878-afcc-3e38d1f689aa" (UID: "10f182d6-67d2-4878-afcc-3e38d1f689aa"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.968750 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "10f182d6-67d2-4878-afcc-3e38d1f689aa" (UID: "10f182d6-67d2-4878-afcc-3e38d1f689aa"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.975617 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-config" (OuterVolumeSpecName: "config") pod "10f182d6-67d2-4878-afcc-3e38d1f689aa" (UID: "10f182d6-67d2-4878-afcc-3e38d1f689aa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.982779 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "10f182d6-67d2-4878-afcc-3e38d1f689aa" (UID: "10f182d6-67d2-4878-afcc-3e38d1f689aa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:17:11 crc kubenswrapper[4851]: I1001 13:17:11.985374 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "10f182d6-67d2-4878-afcc-3e38d1f689aa" (UID: "10f182d6-67d2-4878-afcc-3e38d1f689aa"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:17:12 crc kubenswrapper[4851]: I1001 13:17:12.004489 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:12 crc kubenswrapper[4851]: I1001 13:17:12.004530 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:12 crc kubenswrapper[4851]: I1001 13:17:12.004540 4851 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:12 crc kubenswrapper[4851]: I1001 13:17:12.004550 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dcx8\" (UniqueName: \"kubernetes.io/projected/10f182d6-67d2-4878-afcc-3e38d1f689aa-kube-api-access-7dcx8\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:12 crc kubenswrapper[4851]: I1001 13:17:12.004560 4851 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:12 crc kubenswrapper[4851]: I1001 13:17:12.004570 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10f182d6-67d2-4878-afcc-3e38d1f689aa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:12 crc kubenswrapper[4851]: I1001 13:17:12.181834 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-659b5fb8c5-5wqbr"] Oct 01 13:17:12 crc kubenswrapper[4851]: I1001 13:17:12.251680 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" event={"ID":"73149e01-4273-4703-a6bd-0b44c3ce5aad","Type":"ContainerStarted","Data":"6b9582859bcea7b1174657380b3b7fe5156fe83032a5b326a15b043220f9180d"} Oct 01 13:17:12 crc kubenswrapper[4851]: I1001 13:17:12.253878 4851 generic.go:334] "Generic (PLEG): container finished" podID="10f182d6-67d2-4878-afcc-3e38d1f689aa" containerID="311962d94f7a32ac08ddb66e50fd2304671fec1892f4c2ca64d9941e19d957df" exitCode=0 Oct 01 13:17:12 crc kubenswrapper[4851]: I1001 13:17:12.253919 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" event={"ID":"10f182d6-67d2-4878-afcc-3e38d1f689aa","Type":"ContainerDied","Data":"311962d94f7a32ac08ddb66e50fd2304671fec1892f4c2ca64d9941e19d957df"} Oct 01 13:17:12 crc kubenswrapper[4851]: I1001 13:17:12.253946 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" event={"ID":"10f182d6-67d2-4878-afcc-3e38d1f689aa","Type":"ContainerDied","Data":"3a263e4100487e5bd2f83cb2095497991de5bc438178e14c967f60d592303115"} Oct 01 13:17:12 crc kubenswrapper[4851]: I1001 13:17:12.253963 4851 scope.go:117] "RemoveContainer" containerID="311962d94f7a32ac08ddb66e50fd2304671fec1892f4c2ca64d9941e19d957df" Oct 01 13:17:12 crc kubenswrapper[4851]: I1001 13:17:12.254085 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74745b66cf-sh8zk" Oct 01 13:17:12 crc kubenswrapper[4851]: I1001 13:17:12.290084 4851 scope.go:117] "RemoveContainer" containerID="42c86437fdfbd9a24222fa7007a183b2f7221b67a0de62b064dceca6b3020985" Oct 01 13:17:12 crc kubenswrapper[4851]: I1001 13:17:12.293079 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74745b66cf-sh8zk"] Oct 01 13:17:12 crc kubenswrapper[4851]: I1001 13:17:12.313830 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74745b66cf-sh8zk"] Oct 01 13:17:12 crc kubenswrapper[4851]: I1001 13:17:12.316733 4851 scope.go:117] "RemoveContainer" containerID="311962d94f7a32ac08ddb66e50fd2304671fec1892f4c2ca64d9941e19d957df" Oct 01 13:17:12 crc kubenswrapper[4851]: E1001 13:17:12.317197 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"311962d94f7a32ac08ddb66e50fd2304671fec1892f4c2ca64d9941e19d957df\": container with ID starting with 311962d94f7a32ac08ddb66e50fd2304671fec1892f4c2ca64d9941e19d957df not found: ID does not exist" containerID="311962d94f7a32ac08ddb66e50fd2304671fec1892f4c2ca64d9941e19d957df" Oct 01 13:17:12 crc kubenswrapper[4851]: I1001 13:17:12.317306 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"311962d94f7a32ac08ddb66e50fd2304671fec1892f4c2ca64d9941e19d957df"} err="failed to get container status \"311962d94f7a32ac08ddb66e50fd2304671fec1892f4c2ca64d9941e19d957df\": rpc error: code = NotFound desc = could not find container \"311962d94f7a32ac08ddb66e50fd2304671fec1892f4c2ca64d9941e19d957df\": container with ID starting with 311962d94f7a32ac08ddb66e50fd2304671fec1892f4c2ca64d9941e19d957df not found: ID does not exist" Oct 01 13:17:12 crc kubenswrapper[4851]: I1001 13:17:12.317415 4851 scope.go:117] "RemoveContainer" containerID="42c86437fdfbd9a24222fa7007a183b2f7221b67a0de62b064dceca6b3020985" Oct 01 13:17:12 crc kubenswrapper[4851]: E1001 13:17:12.317920 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42c86437fdfbd9a24222fa7007a183b2f7221b67a0de62b064dceca6b3020985\": container with ID starting with 42c86437fdfbd9a24222fa7007a183b2f7221b67a0de62b064dceca6b3020985 not found: ID does not exist" containerID="42c86437fdfbd9a24222fa7007a183b2f7221b67a0de62b064dceca6b3020985" Oct 01 13:17:12 crc kubenswrapper[4851]: I1001 13:17:12.318018 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42c86437fdfbd9a24222fa7007a183b2f7221b67a0de62b064dceca6b3020985"} err="failed to get container status \"42c86437fdfbd9a24222fa7007a183b2f7221b67a0de62b064dceca6b3020985\": rpc error: code = NotFound desc = could not find container \"42c86437fdfbd9a24222fa7007a183b2f7221b67a0de62b064dceca6b3020985\": container with ID starting with 42c86437fdfbd9a24222fa7007a183b2f7221b67a0de62b064dceca6b3020985 not found: ID does not exist" Oct 01 13:17:12 crc kubenswrapper[4851]: I1001 13:17:12.353318 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10f182d6-67d2-4878-afcc-3e38d1f689aa" path="/var/lib/kubelet/pods/10f182d6-67d2-4878-afcc-3e38d1f689aa/volumes" Oct 01 13:17:13 crc kubenswrapper[4851]: I1001 13:17:13.273273 4851 generic.go:334] "Generic (PLEG): container finished" podID="73149e01-4273-4703-a6bd-0b44c3ce5aad" containerID="723786c54ae3a5c2e5c250d18a9e245443e3f0672d18afc458f6231bad9391a4" 
exitCode=0 Oct 01 13:17:13 crc kubenswrapper[4851]: I1001 13:17:13.273675 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" event={"ID":"73149e01-4273-4703-a6bd-0b44c3ce5aad","Type":"ContainerDied","Data":"723786c54ae3a5c2e5c250d18a9e245443e3f0672d18afc458f6231bad9391a4"} Oct 01 13:17:14 crc kubenswrapper[4851]: I1001 13:17:14.287663 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" event={"ID":"73149e01-4273-4703-a6bd-0b44c3ce5aad","Type":"ContainerStarted","Data":"1403b6854bc44f808a2ed98fbd33d6d116d38421fada144c74ff2863e279d8f3"} Oct 01 13:17:14 crc kubenswrapper[4851]: I1001 13:17:14.288193 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:14 crc kubenswrapper[4851]: I1001 13:17:14.319007 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" podStartSLOduration=3.318982945 podStartE2EDuration="3.318982945s" podCreationTimestamp="2025-10-01 13:17:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:17:14.315156706 +0000 UTC m=+1442.660274212" watchObservedRunningTime="2025-10-01 13:17:14.318982945 +0000 UTC m=+1442.664100461" Oct 01 13:17:14 crc kubenswrapper[4851]: I1001 13:17:14.763333 4851 scope.go:117] "RemoveContainer" containerID="63cbe7c7b8de78d4e986d60e321b3deb9b259065e64592c261935951e57ecc35" Oct 01 13:17:14 crc kubenswrapper[4851]: I1001 13:17:14.791210 4851 scope.go:117] "RemoveContainer" containerID="30e5d47cf1ecb5ce3426302c5a3b0bedb12f82a2d45145740bfbfaec971d8312" Oct 01 13:17:14 crc kubenswrapper[4851]: I1001 13:17:14.858905 4851 scope.go:117] "RemoveContainer" containerID="29621214da02542afefea5057c85fdba14ff3c75adbd9b26abad60c55e5a212c" Oct 01 13:17:21 crc kubenswrapper[4851]: I1001 13:17:21.713861 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-659b5fb8c5-5wqbr" Oct 01 13:17:21 crc kubenswrapper[4851]: I1001 13:17:21.795050 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-696b555f9-rrxdl"] Oct 01 13:17:21 crc kubenswrapper[4851]: I1001 13:17:21.795293 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-696b555f9-rrxdl" podUID="2c35b1e3-81f3-400d-944b-e150f959dc31" containerName="dnsmasq-dns" containerID="cri-o://fe2ace37468bb514221303279eb3e2ea7b1ddeff36cfda3368cd36d586722216" gracePeriod=10 Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.286011 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.338878 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnwn5\" (UniqueName: \"kubernetes.io/projected/2c35b1e3-81f3-400d-944b-e150f959dc31-kube-api-access-lnwn5\") pod \"2c35b1e3-81f3-400d-944b-e150f959dc31\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.338933 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-ovsdbserver-nb\") pod \"2c35b1e3-81f3-400d-944b-e150f959dc31\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.339120 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-ovsdbserver-sb\") pod \"2c35b1e3-81f3-400d-944b-e150f959dc31\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.355358 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c35b1e3-81f3-400d-944b-e150f959dc31-kube-api-access-lnwn5" (OuterVolumeSpecName: "kube-api-access-lnwn5") pod "2c35b1e3-81f3-400d-944b-e150f959dc31" (UID: "2c35b1e3-81f3-400d-944b-e150f959dc31"). InnerVolumeSpecName "kube-api-access-lnwn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.403351 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2c35b1e3-81f3-400d-944b-e150f959dc31" (UID: "2c35b1e3-81f3-400d-944b-e150f959dc31"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.403412 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2c35b1e3-81f3-400d-944b-e150f959dc31" (UID: "2c35b1e3-81f3-400d-944b-e150f959dc31"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.417523 4851 generic.go:334] "Generic (PLEG): container finished" podID="2c35b1e3-81f3-400d-944b-e150f959dc31" containerID="fe2ace37468bb514221303279eb3e2ea7b1ddeff36cfda3368cd36d586722216" exitCode=0 Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.417598 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-696b555f9-rrxdl" Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.439113 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696b555f9-rrxdl" event={"ID":"2c35b1e3-81f3-400d-944b-e150f959dc31","Type":"ContainerDied","Data":"fe2ace37468bb514221303279eb3e2ea7b1ddeff36cfda3368cd36d586722216"} Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.439154 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696b555f9-rrxdl" event={"ID":"2c35b1e3-81f3-400d-944b-e150f959dc31","Type":"ContainerDied","Data":"2816e67c80ef12029fa876e98a6516eda6c80e63176684ef7869d725274377b5"} Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.439171 4851 scope.go:117] "RemoveContainer" containerID="fe2ace37468bb514221303279eb3e2ea7b1ddeff36cfda3368cd36d586722216" Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.440546 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-dns-swift-storage-0\") pod \"2c35b1e3-81f3-400d-944b-e150f959dc31\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.440638 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-openstack-edpm-ipam\") pod \"2c35b1e3-81f3-400d-944b-e150f959dc31\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.440711 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-config\") pod \"2c35b1e3-81f3-400d-944b-e150f959dc31\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.441039 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-dns-svc\") pod \"2c35b1e3-81f3-400d-944b-e150f959dc31\" (UID: \"2c35b1e3-81f3-400d-944b-e150f959dc31\") " Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.441585 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.441607 4851 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.441620 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnwn5\" (UniqueName: \"kubernetes.io/projected/2c35b1e3-81f3-400d-944b-e150f959dc31-kube-api-access-lnwn5\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.461051 4851 scope.go:117] "RemoveContainer" containerID="da15fdf2e5fea329896ab7f98e490c5e3b80ae09437f6a7d5aa8a7bcc2de7eca" Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.482549 4851 scope.go:117] "RemoveContainer" containerID="fe2ace37468bb514221303279eb3e2ea7b1ddeff36cfda3368cd36d586722216" Oct 01 13:17:22 crc kubenswrapper[4851]: E1001 13:17:22.483044 4851 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe2ace37468bb514221303279eb3e2ea7b1ddeff36cfda3368cd36d586722216\": container with ID starting with fe2ace37468bb514221303279eb3e2ea7b1ddeff36cfda3368cd36d586722216 not found: ID does not exist" containerID="fe2ace37468bb514221303279eb3e2ea7b1ddeff36cfda3368cd36d586722216" Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.483110 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe2ace37468bb514221303279eb3e2ea7b1ddeff36cfda3368cd36d586722216"} err="failed to get container status \"fe2ace37468bb514221303279eb3e2ea7b1ddeff36cfda3368cd36d586722216\": rpc error: code = NotFound desc = could not find container \"fe2ace37468bb514221303279eb3e2ea7b1ddeff36cfda3368cd36d586722216\": container with ID starting with fe2ace37468bb514221303279eb3e2ea7b1ddeff36cfda3368cd36d586722216 not found: ID does not exist" Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.483272 4851 scope.go:117] "RemoveContainer" containerID="da15fdf2e5fea329896ab7f98e490c5e3b80ae09437f6a7d5aa8a7bcc2de7eca" Oct 01 13:17:22 crc kubenswrapper[4851]: E1001 13:17:22.483796 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da15fdf2e5fea329896ab7f98e490c5e3b80ae09437f6a7d5aa8a7bcc2de7eca\": container with ID starting with da15fdf2e5fea329896ab7f98e490c5e3b80ae09437f6a7d5aa8a7bcc2de7eca not found: ID does not exist" containerID="da15fdf2e5fea329896ab7f98e490c5e3b80ae09437f6a7d5aa8a7bcc2de7eca" Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.483842 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da15fdf2e5fea329896ab7f98e490c5e3b80ae09437f6a7d5aa8a7bcc2de7eca"} err="failed to get container status \"da15fdf2e5fea329896ab7f98e490c5e3b80ae09437f6a7d5aa8a7bcc2de7eca\": rpc error: code = NotFound desc = could not find container \"da15fdf2e5fea329896ab7f98e490c5e3b80ae09437f6a7d5aa8a7bcc2de7eca\": container with ID starting with da15fdf2e5fea329896ab7f98e490c5e3b80ae09437f6a7d5aa8a7bcc2de7eca not found: ID does not exist" Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.491380 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "2c35b1e3-81f3-400d-944b-e150f959dc31" (UID: "2c35b1e3-81f3-400d-944b-e150f959dc31"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.495148 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2c35b1e3-81f3-400d-944b-e150f959dc31" (UID: "2c35b1e3-81f3-400d-944b-e150f959dc31"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.500465 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2c35b1e3-81f3-400d-944b-e150f959dc31" (UID: "2c35b1e3-81f3-400d-944b-e150f959dc31"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.507669 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-config" (OuterVolumeSpecName: "config") pod "2c35b1e3-81f3-400d-944b-e150f959dc31" (UID: "2c35b1e3-81f3-400d-944b-e150f959dc31"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.544633 4851 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.544680 4851 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.544703 4851 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.544721 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c35b1e3-81f3-400d-944b-e150f959dc31-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.762177 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-696b555f9-rrxdl"] Oct 01 13:17:22 crc kubenswrapper[4851]: I1001 13:17:22.771748 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-696b555f9-rrxdl"] Oct 01 13:17:24 crc kubenswrapper[4851]: I1001 13:17:24.345129 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c35b1e3-81f3-400d-944b-e150f959dc31" path="/var/lib/kubelet/pods/2c35b1e3-81f3-400d-944b-e150f959dc31/volumes" Oct 01 13:17:27 crc kubenswrapper[4851]: I1001 13:17:27.490431 4851 generic.go:334] "Generic (PLEG): container finished" podID="73f3b1c2-f1c0-47b8-bf31-4d2a185c852e" containerID="2ae1a7e9c17889412cebc7c7cbb00d8523a34533d94910317e518bc56b5a27cc" exitCode=0 Oct 01 13:17:27 crc kubenswrapper[4851]: I1001 13:17:27.490496 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e","Type":"ContainerDied","Data":"2ae1a7e9c17889412cebc7c7cbb00d8523a34533d94910317e518bc56b5a27cc"} Oct 01 13:17:28 crc kubenswrapper[4851]: I1001 13:17:28.504942 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"73f3b1c2-f1c0-47b8-bf31-4d2a185c852e","Type":"ContainerStarted","Data":"42bcc8c43ec37f565ce4c6ccbcbe437a84df35a27fe91a2a7d9259e73ec97a67"} Oct 01 13:17:28 crc kubenswrapper[4851]: I1001 13:17:28.505474 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 01 13:17:28 crc kubenswrapper[4851]: I1001 13:17:28.506988 4851 generic.go:334] "Generic (PLEG): container finished" podID="3dc04e94-f66a-4937-86d2-24def7247794" containerID="8c9577d8cb562ce4f8e7c62e531afedec52f17beadd68d3d53eff0f72f12dd28" exitCode=0 Oct 01 13:17:28 crc kubenswrapper[4851]: I1001 13:17:28.507040 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"3dc04e94-f66a-4937-86d2-24def7247794","Type":"ContainerDied","Data":"8c9577d8cb562ce4f8e7c62e531afedec52f17beadd68d3d53eff0f72f12dd28"} Oct 01 13:17:28 crc kubenswrapper[4851]: I1001 13:17:28.542227 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.542204078 podStartE2EDuration="37.542204078s" podCreationTimestamp="2025-10-01 13:16:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:17:28.53173556 +0000 UTC m=+1456.876853076" watchObservedRunningTime="2025-10-01 13:17:28.542204078 +0000 UTC m=+1456.887321564" Oct 01 13:17:29 crc kubenswrapper[4851]: I1001 13:17:29.518370 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3dc04e94-f66a-4937-86d2-24def7247794","Type":"ContainerStarted","Data":"2046a59f5d94edb59361a7a1ae4819517d2fa24efd703255640c889b02ed9b0b"} Oct 01 13:17:29 crc kubenswrapper[4851]: I1001 13:17:29.519639 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:17:29 crc kubenswrapper[4851]: I1001 13:17:29.550849 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.550831398 podStartE2EDuration="37.550831398s" podCreationTimestamp="2025-10-01 13:16:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:17:29.545736533 +0000 UTC m=+1457.890854049" watchObservedRunningTime="2025-10-01 13:17:29.550831398 +0000 UTC m=+1457.895948894" Oct 01 13:17:36 crc kubenswrapper[4851]: I1001 13:17:36.504776 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c9f8x"] Oct 01 13:17:36 crc kubenswrapper[4851]: E1001 13:17:36.505791 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c35b1e3-81f3-400d-944b-e150f959dc31" containerName="init" Oct 01 13:17:36 crc kubenswrapper[4851]: I1001 13:17:36.505807 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c35b1e3-81f3-400d-944b-e150f959dc31" containerName="init" Oct 01 13:17:36 crc kubenswrapper[4851]: E1001 13:17:36.505833 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10f182d6-67d2-4878-afcc-3e38d1f689aa" containerName="dnsmasq-dns" Oct 01 13:17:36 crc kubenswrapper[4851]: I1001 13:17:36.505840 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f182d6-67d2-4878-afcc-3e38d1f689aa" containerName="dnsmasq-dns" Oct 01 13:17:36 crc kubenswrapper[4851]: E1001 13:17:36.505850 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10f182d6-67d2-4878-afcc-3e38d1f689aa" containerName="init" Oct 01 13:17:36 crc kubenswrapper[4851]: I1001 13:17:36.505858 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f182d6-67d2-4878-afcc-3e38d1f689aa" containerName="init" Oct 01 13:17:36 crc kubenswrapper[4851]: E1001 13:17:36.505900 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c35b1e3-81f3-400d-944b-e150f959dc31" containerName="dnsmasq-dns" Oct 01 13:17:36 crc kubenswrapper[4851]: I1001 13:17:36.505907 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c35b1e3-81f3-400d-944b-e150f959dc31" containerName="dnsmasq-dns" Oct 01 13:17:36 crc kubenswrapper[4851]: I1001 13:17:36.506150 4851 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="2c35b1e3-81f3-400d-944b-e150f959dc31" containerName="dnsmasq-dns" Oct 01 13:17:36 crc kubenswrapper[4851]: I1001 13:17:36.506167 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="10f182d6-67d2-4878-afcc-3e38d1f689aa" containerName="dnsmasq-dns" Oct 01 13:17:36 crc kubenswrapper[4851]: I1001 13:17:36.508075 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c9f8x" Oct 01 13:17:36 crc kubenswrapper[4851]: I1001 13:17:36.518589 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c9f8x"] Oct 01 13:17:36 crc kubenswrapper[4851]: I1001 13:17:36.634192 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b1ff910-e61d-4c4a-aa74-9b6bde4a059f-utilities\") pod \"certified-operators-c9f8x\" (UID: \"8b1ff910-e61d-4c4a-aa74-9b6bde4a059f\") " pod="openshift-marketplace/certified-operators-c9f8x" Oct 01 13:17:36 crc kubenswrapper[4851]: I1001 13:17:36.634257 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b1ff910-e61d-4c4a-aa74-9b6bde4a059f-catalog-content\") pod \"certified-operators-c9f8x\" (UID: \"8b1ff910-e61d-4c4a-aa74-9b6bde4a059f\") " pod="openshift-marketplace/certified-operators-c9f8x" Oct 01 13:17:36 crc kubenswrapper[4851]: I1001 13:17:36.634366 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcglb\" (UniqueName: \"kubernetes.io/projected/8b1ff910-e61d-4c4a-aa74-9b6bde4a059f-kube-api-access-hcglb\") pod \"certified-operators-c9f8x\" (UID: \"8b1ff910-e61d-4c4a-aa74-9b6bde4a059f\") " pod="openshift-marketplace/certified-operators-c9f8x" Oct 01 13:17:36 crc kubenswrapper[4851]: I1001 13:17:36.736974 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b1ff910-e61d-4c4a-aa74-9b6bde4a059f-catalog-content\") pod \"certified-operators-c9f8x\" (UID: \"8b1ff910-e61d-4c4a-aa74-9b6bde4a059f\") " pod="openshift-marketplace/certified-operators-c9f8x" Oct 01 13:17:36 crc kubenswrapper[4851]: I1001 13:17:36.737072 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcglb\" (UniqueName: \"kubernetes.io/projected/8b1ff910-e61d-4c4a-aa74-9b6bde4a059f-kube-api-access-hcglb\") pod \"certified-operators-c9f8x\" (UID: \"8b1ff910-e61d-4c4a-aa74-9b6bde4a059f\") " pod="openshift-marketplace/certified-operators-c9f8x" Oct 01 13:17:36 crc kubenswrapper[4851]: I1001 13:17:36.737266 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b1ff910-e61d-4c4a-aa74-9b6bde4a059f-utilities\") pod \"certified-operators-c9f8x\" (UID: \"8b1ff910-e61d-4c4a-aa74-9b6bde4a059f\") " pod="openshift-marketplace/certified-operators-c9f8x" Oct 01 13:17:36 crc kubenswrapper[4851]: I1001 13:17:36.737593 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b1ff910-e61d-4c4a-aa74-9b6bde4a059f-catalog-content\") pod \"certified-operators-c9f8x\" (UID: \"8b1ff910-e61d-4c4a-aa74-9b6bde4a059f\") " pod="openshift-marketplace/certified-operators-c9f8x" Oct 01 13:17:36 crc kubenswrapper[4851]: I1001 
13:17:36.737680 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b1ff910-e61d-4c4a-aa74-9b6bde4a059f-utilities\") pod \"certified-operators-c9f8x\" (UID: \"8b1ff910-e61d-4c4a-aa74-9b6bde4a059f\") " pod="openshift-marketplace/certified-operators-c9f8x" Oct 01 13:17:36 crc kubenswrapper[4851]: I1001 13:17:36.765360 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcglb\" (UniqueName: \"kubernetes.io/projected/8b1ff910-e61d-4c4a-aa74-9b6bde4a059f-kube-api-access-hcglb\") pod \"certified-operators-c9f8x\" (UID: \"8b1ff910-e61d-4c4a-aa74-9b6bde4a059f\") " pod="openshift-marketplace/certified-operators-c9f8x" Oct 01 13:17:36 crc kubenswrapper[4851]: I1001 13:17:36.827767 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c9f8x" Oct 01 13:17:37 crc kubenswrapper[4851]: I1001 13:17:37.361656 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c9f8x"] Oct 01 13:17:37 crc kubenswrapper[4851]: I1001 13:17:37.591247 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9f8x" event={"ID":"8b1ff910-e61d-4c4a-aa74-9b6bde4a059f","Type":"ContainerStarted","Data":"80f8188c6aa98bafc24446b6e8cbf653f048ed249ef250688d3f06f334789dfe"} Oct 01 13:17:37 crc kubenswrapper[4851]: I1001 13:17:37.591301 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9f8x" event={"ID":"8b1ff910-e61d-4c4a-aa74-9b6bde4a059f","Type":"ContainerStarted","Data":"f86ac635665850d77761cf9d834b1b0dfea37942fd07f2441dbefd32d401b5b2"} Oct 01 13:17:38 crc kubenswrapper[4851]: I1001 13:17:38.605907 4851 generic.go:334] "Generic (PLEG): container finished" podID="8b1ff910-e61d-4c4a-aa74-9b6bde4a059f" containerID="80f8188c6aa98bafc24446b6e8cbf653f048ed249ef250688d3f06f334789dfe" exitCode=0 Oct 01 13:17:38 crc kubenswrapper[4851]: I1001 13:17:38.606314 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9f8x" event={"ID":"8b1ff910-e61d-4c4a-aa74-9b6bde4a059f","Type":"ContainerDied","Data":"80f8188c6aa98bafc24446b6e8cbf653f048ed249ef250688d3f06f334789dfe"} Oct 01 13:17:40 crc kubenswrapper[4851]: I1001 13:17:40.625439 4851 generic.go:334] "Generic (PLEG): container finished" podID="8b1ff910-e61d-4c4a-aa74-9b6bde4a059f" containerID="d7a0e0238d949e41785e440b0b98c00a8ebd2e69a296e5db4624e3c49d60d44b" exitCode=0 Oct 01 13:17:40 crc kubenswrapper[4851]: I1001 13:17:40.626918 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9f8x" event={"ID":"8b1ff910-e61d-4c4a-aa74-9b6bde4a059f","Type":"ContainerDied","Data":"d7a0e0238d949e41785e440b0b98c00a8ebd2e69a296e5db4624e3c49d60d44b"} Oct 01 13:17:40 crc kubenswrapper[4851]: I1001 13:17:40.644459 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47"] Oct 01 13:17:40 crc kubenswrapper[4851]: I1001 13:17:40.646555 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47" Oct 01 13:17:40 crc kubenswrapper[4851]: I1001 13:17:40.648261 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:17:40 crc kubenswrapper[4851]: I1001 13:17:40.648748 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:17:40 crc kubenswrapper[4851]: I1001 13:17:40.652970 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2tz4d" Oct 01 13:17:40 crc kubenswrapper[4851]: I1001 13:17:40.654024 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:17:40 crc kubenswrapper[4851]: I1001 13:17:40.663626 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47"] Oct 01 13:17:40 crc kubenswrapper[4851]: I1001 13:17:40.720626 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw6zk\" (UniqueName: \"kubernetes.io/projected/9eaff7ea-f90a-4cfa-8850-cf308591ca11-kube-api-access-mw6zk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47\" (UID: \"9eaff7ea-f90a-4cfa-8850-cf308591ca11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47" Oct 01 13:17:40 crc kubenswrapper[4851]: I1001 13:17:40.720728 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9eaff7ea-f90a-4cfa-8850-cf308591ca11-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47\" (UID: \"9eaff7ea-f90a-4cfa-8850-cf308591ca11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47" Oct 01 13:17:40 crc kubenswrapper[4851]: I1001 13:17:40.720809 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eaff7ea-f90a-4cfa-8850-cf308591ca11-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47\" (UID: \"9eaff7ea-f90a-4cfa-8850-cf308591ca11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47" Oct 01 13:17:40 crc kubenswrapper[4851]: I1001 13:17:40.720945 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9eaff7ea-f90a-4cfa-8850-cf308591ca11-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47\" (UID: \"9eaff7ea-f90a-4cfa-8850-cf308591ca11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47" Oct 01 13:17:40 crc kubenswrapper[4851]: I1001 13:17:40.822181 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eaff7ea-f90a-4cfa-8850-cf308591ca11-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47\" (UID: \"9eaff7ea-f90a-4cfa-8850-cf308591ca11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47" Oct 01 13:17:40 crc kubenswrapper[4851]: I1001 13:17:40.822398 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9eaff7ea-f90a-4cfa-8850-cf308591ca11-ssh-key\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47\" (UID: \"9eaff7ea-f90a-4cfa-8850-cf308591ca11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47" Oct 01 13:17:40 crc kubenswrapper[4851]: I1001 13:17:40.822464 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw6zk\" (UniqueName: \"kubernetes.io/projected/9eaff7ea-f90a-4cfa-8850-cf308591ca11-kube-api-access-mw6zk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47\" (UID: \"9eaff7ea-f90a-4cfa-8850-cf308591ca11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47" Oct 01 13:17:40 crc kubenswrapper[4851]: I1001 13:17:40.822525 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9eaff7ea-f90a-4cfa-8850-cf308591ca11-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47\" (UID: \"9eaff7ea-f90a-4cfa-8850-cf308591ca11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47" Oct 01 13:17:40 crc kubenswrapper[4851]: I1001 13:17:40.827885 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9eaff7ea-f90a-4cfa-8850-cf308591ca11-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47\" (UID: \"9eaff7ea-f90a-4cfa-8850-cf308591ca11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47" Oct 01 13:17:40 crc kubenswrapper[4851]: I1001 13:17:40.828226 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9eaff7ea-f90a-4cfa-8850-cf308591ca11-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47\" (UID: \"9eaff7ea-f90a-4cfa-8850-cf308591ca11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47" Oct 01 13:17:40 crc kubenswrapper[4851]: I1001 13:17:40.828237 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eaff7ea-f90a-4cfa-8850-cf308591ca11-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47\" (UID: \"9eaff7ea-f90a-4cfa-8850-cf308591ca11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47" Oct 01 13:17:40 crc kubenswrapper[4851]: I1001 13:17:40.840699 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw6zk\" (UniqueName: \"kubernetes.io/projected/9eaff7ea-f90a-4cfa-8850-cf308591ca11-kube-api-access-mw6zk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47\" (UID: \"9eaff7ea-f90a-4cfa-8850-cf308591ca11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47" Oct 01 13:17:40 crc kubenswrapper[4851]: I1001 13:17:40.971073 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47" Oct 01 13:17:41 crc kubenswrapper[4851]: I1001 13:17:41.382752 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 01 13:17:41 crc kubenswrapper[4851]: I1001 13:17:41.625393 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47"] Oct 01 13:17:41 crc kubenswrapper[4851]: W1001 13:17:41.640700 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9eaff7ea_f90a_4cfa_8850_cf308591ca11.slice/crio-585ab23e6e605b7d4ca24679e346dd205207b3891ab5fe1688ba2ee60f604124 WatchSource:0}: Error finding container 585ab23e6e605b7d4ca24679e346dd205207b3891ab5fe1688ba2ee60f604124: Status 404 returned error can't find the container with id 585ab23e6e605b7d4ca24679e346dd205207b3891ab5fe1688ba2ee60f604124 Oct 01 13:17:42 crc kubenswrapper[4851]: I1001 13:17:42.456795 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:17:42 crc kubenswrapper[4851]: I1001 13:17:42.684711 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9f8x" event={"ID":"8b1ff910-e61d-4c4a-aa74-9b6bde4a059f","Type":"ContainerStarted","Data":"06acd5b0597f962ecef7763d8735d55b1d4838fd44f9e689c1019c55a24421ce"} Oct 01 13:17:42 crc kubenswrapper[4851]: I1001 13:17:42.687910 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47" event={"ID":"9eaff7ea-f90a-4cfa-8850-cf308591ca11","Type":"ContainerStarted","Data":"585ab23e6e605b7d4ca24679e346dd205207b3891ab5fe1688ba2ee60f604124"} Oct 01 13:17:42 crc kubenswrapper[4851]: I1001 13:17:42.700059 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c9f8x" podStartSLOduration=3.726978574 podStartE2EDuration="6.700043638s" podCreationTimestamp="2025-10-01 13:17:36 +0000 UTC" firstStartedPulling="2025-10-01 13:17:38.608528615 +0000 UTC m=+1466.953646101" lastFinishedPulling="2025-10-01 13:17:41.581593679 +0000 UTC m=+1469.926711165" observedRunningTime="2025-10-01 13:17:42.699990436 +0000 UTC m=+1471.045107932" watchObservedRunningTime="2025-10-01 13:17:42.700043638 +0000 UTC m=+1471.045161124" Oct 01 13:17:46 crc kubenswrapper[4851]: I1001 13:17:46.828251 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c9f8x" Oct 01 13:17:46 crc kubenswrapper[4851]: I1001 13:17:46.829195 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c9f8x" Oct 01 13:17:46 crc kubenswrapper[4851]: I1001 13:17:46.904731 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c9f8x" Oct 01 13:17:47 crc kubenswrapper[4851]: I1001 13:17:47.809423 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c9f8x" Oct 01 13:17:47 crc kubenswrapper[4851]: I1001 13:17:47.854973 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c9f8x"] Oct 01 13:17:49 crc kubenswrapper[4851]: I1001 13:17:49.789796 4851 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-c9f8x" podUID="8b1ff910-e61d-4c4a-aa74-9b6bde4a059f" containerName="registry-server" containerID="cri-o://06acd5b0597f962ecef7763d8735d55b1d4838fd44f9e689c1019c55a24421ce" gracePeriod=2 Oct 01 13:17:49 crc kubenswrapper[4851]: E1001 13:17:49.976514 4851 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b1ff910_e61d_4c4a_aa74_9b6bde4a059f.slice/crio-conmon-06acd5b0597f962ecef7763d8735d55b1d4838fd44f9e689c1019c55a24421ce.scope\": RecentStats: unable to find data in memory cache]" Oct 01 13:17:50 crc kubenswrapper[4851]: I1001 13:17:50.806946 4851 generic.go:334] "Generic (PLEG): container finished" podID="8b1ff910-e61d-4c4a-aa74-9b6bde4a059f" containerID="06acd5b0597f962ecef7763d8735d55b1d4838fd44f9e689c1019c55a24421ce" exitCode=0 Oct 01 13:17:50 crc kubenswrapper[4851]: I1001 13:17:50.806998 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9f8x" event={"ID":"8b1ff910-e61d-4c4a-aa74-9b6bde4a059f","Type":"ContainerDied","Data":"06acd5b0597f962ecef7763d8735d55b1d4838fd44f9e689c1019c55a24421ce"} Oct 01 13:17:51 crc kubenswrapper[4851]: I1001 13:17:51.335065 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c9f8x" Oct 01 13:17:51 crc kubenswrapper[4851]: I1001 13:17:51.401069 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b1ff910-e61d-4c4a-aa74-9b6bde4a059f-catalog-content\") pod \"8b1ff910-e61d-4c4a-aa74-9b6bde4a059f\" (UID: \"8b1ff910-e61d-4c4a-aa74-9b6bde4a059f\") " Oct 01 13:17:51 crc kubenswrapper[4851]: I1001 13:17:51.401204 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcglb\" (UniqueName: \"kubernetes.io/projected/8b1ff910-e61d-4c4a-aa74-9b6bde4a059f-kube-api-access-hcglb\") pod \"8b1ff910-e61d-4c4a-aa74-9b6bde4a059f\" (UID: \"8b1ff910-e61d-4c4a-aa74-9b6bde4a059f\") " Oct 01 13:17:51 crc kubenswrapper[4851]: I1001 13:17:51.405878 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b1ff910-e61d-4c4a-aa74-9b6bde4a059f-kube-api-access-hcglb" (OuterVolumeSpecName: "kube-api-access-hcglb") pod "8b1ff910-e61d-4c4a-aa74-9b6bde4a059f" (UID: "8b1ff910-e61d-4c4a-aa74-9b6bde4a059f"). InnerVolumeSpecName "kube-api-access-hcglb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:17:51 crc kubenswrapper[4851]: I1001 13:17:51.441269 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b1ff910-e61d-4c4a-aa74-9b6bde4a059f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b1ff910-e61d-4c4a-aa74-9b6bde4a059f" (UID: "8b1ff910-e61d-4c4a-aa74-9b6bde4a059f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:17:51 crc kubenswrapper[4851]: I1001 13:17:51.503898 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b1ff910-e61d-4c4a-aa74-9b6bde4a059f-utilities\") pod \"8b1ff910-e61d-4c4a-aa74-9b6bde4a059f\" (UID: \"8b1ff910-e61d-4c4a-aa74-9b6bde4a059f\") " Oct 01 13:17:51 crc kubenswrapper[4851]: I1001 13:17:51.504581 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b1ff910-e61d-4c4a-aa74-9b6bde4a059f-utilities" (OuterVolumeSpecName: "utilities") pod "8b1ff910-e61d-4c4a-aa74-9b6bde4a059f" (UID: "8b1ff910-e61d-4c4a-aa74-9b6bde4a059f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:17:51 crc kubenswrapper[4851]: I1001 13:17:51.504709 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b1ff910-e61d-4c4a-aa74-9b6bde4a059f-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:51 crc kubenswrapper[4851]: I1001 13:17:51.504725 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b1ff910-e61d-4c4a-aa74-9b6bde4a059f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:51 crc kubenswrapper[4851]: I1001 13:17:51.504739 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcglb\" (UniqueName: \"kubernetes.io/projected/8b1ff910-e61d-4c4a-aa74-9b6bde4a059f-kube-api-access-hcglb\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:51 crc kubenswrapper[4851]: I1001 13:17:51.820436 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47" event={"ID":"9eaff7ea-f90a-4cfa-8850-cf308591ca11","Type":"ContainerStarted","Data":"c51a6cc2fc8cb591576d8f310a01b22116bd03141078be0b4fd6f4d8d351a25f"} Oct 01 13:17:51 crc kubenswrapper[4851]: I1001 13:17:51.826269 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9f8x" event={"ID":"8b1ff910-e61d-4c4a-aa74-9b6bde4a059f","Type":"ContainerDied","Data":"f86ac635665850d77761cf9d834b1b0dfea37942fd07f2441dbefd32d401b5b2"} Oct 01 13:17:51 crc kubenswrapper[4851]: I1001 13:17:51.826342 4851 scope.go:117] "RemoveContainer" containerID="06acd5b0597f962ecef7763d8735d55b1d4838fd44f9e689c1019c55a24421ce" Oct 01 13:17:51 crc kubenswrapper[4851]: I1001 13:17:51.826352 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c9f8x" Oct 01 13:17:51 crc kubenswrapper[4851]: I1001 13:17:51.857860 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47" podStartSLOduration=2.50994646 podStartE2EDuration="11.857841588s" podCreationTimestamp="2025-10-01 13:17:40 +0000 UTC" firstStartedPulling="2025-10-01 13:17:41.642589597 +0000 UTC m=+1469.987707083" lastFinishedPulling="2025-10-01 13:17:50.990484725 +0000 UTC m=+1479.335602211" observedRunningTime="2025-10-01 13:17:51.851330462 +0000 UTC m=+1480.196447948" watchObservedRunningTime="2025-10-01 13:17:51.857841588 +0000 UTC m=+1480.202959074" Oct 01 13:17:51 crc kubenswrapper[4851]: I1001 13:17:51.865362 4851 scope.go:117] "RemoveContainer" containerID="d7a0e0238d949e41785e440b0b98c00a8ebd2e69a296e5db4624e3c49d60d44b" Oct 01 13:17:51 crc kubenswrapper[4851]: I1001 13:17:51.894573 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c9f8x"] Oct 01 13:17:51 crc kubenswrapper[4851]: I1001 13:17:51.903139 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c9f8x"] Oct 01 13:17:51 crc kubenswrapper[4851]: I1001 13:17:51.932662 4851 scope.go:117] "RemoveContainer" containerID="80f8188c6aa98bafc24446b6e8cbf653f048ed249ef250688d3f06f334789dfe" Oct 01 13:17:52 crc kubenswrapper[4851]: I1001 13:17:52.348854 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b1ff910-e61d-4c4a-aa74-9b6bde4a059f" path="/var/lib/kubelet/pods/8b1ff910-e61d-4c4a-aa74-9b6bde4a059f/volumes" Oct 01 13:17:54 crc kubenswrapper[4851]: I1001 13:17:54.743414 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wfqdl"] Oct 01 13:17:54 crc kubenswrapper[4851]: E1001 13:17:54.744127 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b1ff910-e61d-4c4a-aa74-9b6bde4a059f" containerName="registry-server" Oct 01 13:17:54 crc kubenswrapper[4851]: I1001 13:17:54.744140 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b1ff910-e61d-4c4a-aa74-9b6bde4a059f" containerName="registry-server" Oct 01 13:17:54 crc kubenswrapper[4851]: E1001 13:17:54.744159 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b1ff910-e61d-4c4a-aa74-9b6bde4a059f" containerName="extract-utilities" Oct 01 13:17:54 crc kubenswrapper[4851]: I1001 13:17:54.744165 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b1ff910-e61d-4c4a-aa74-9b6bde4a059f" containerName="extract-utilities" Oct 01 13:17:54 crc kubenswrapper[4851]: E1001 13:17:54.744184 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b1ff910-e61d-4c4a-aa74-9b6bde4a059f" containerName="extract-content" Oct 01 13:17:54 crc kubenswrapper[4851]: I1001 13:17:54.744189 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b1ff910-e61d-4c4a-aa74-9b6bde4a059f" containerName="extract-content" Oct 01 13:17:54 crc kubenswrapper[4851]: I1001 13:17:54.744359 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b1ff910-e61d-4c4a-aa74-9b6bde4a059f" containerName="registry-server" Oct 01 13:17:54 crc kubenswrapper[4851]: I1001 13:17:54.746013 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wfqdl" Oct 01 13:17:54 crc kubenswrapper[4851]: I1001 13:17:54.768463 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wfqdl"] Oct 01 13:17:54 crc kubenswrapper[4851]: I1001 13:17:54.875835 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcfvr\" (UniqueName: \"kubernetes.io/projected/9a542b47-9b01-40aa-ab73-d6067689a6d7-kube-api-access-fcfvr\") pod \"redhat-operators-wfqdl\" (UID: \"9a542b47-9b01-40aa-ab73-d6067689a6d7\") " pod="openshift-marketplace/redhat-operators-wfqdl" Oct 01 13:17:54 crc kubenswrapper[4851]: I1001 13:17:54.876135 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a542b47-9b01-40aa-ab73-d6067689a6d7-catalog-content\") pod \"redhat-operators-wfqdl\" (UID: \"9a542b47-9b01-40aa-ab73-d6067689a6d7\") " pod="openshift-marketplace/redhat-operators-wfqdl" Oct 01 13:17:54 crc kubenswrapper[4851]: I1001 13:17:54.876184 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a542b47-9b01-40aa-ab73-d6067689a6d7-utilities\") pod \"redhat-operators-wfqdl\" (UID: \"9a542b47-9b01-40aa-ab73-d6067689a6d7\") " pod="openshift-marketplace/redhat-operators-wfqdl" Oct 01 13:17:54 crc kubenswrapper[4851]: I1001 13:17:54.978034 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a542b47-9b01-40aa-ab73-d6067689a6d7-catalog-content\") pod \"redhat-operators-wfqdl\" (UID: \"9a542b47-9b01-40aa-ab73-d6067689a6d7\") " pod="openshift-marketplace/redhat-operators-wfqdl" Oct 01 13:17:54 crc kubenswrapper[4851]: I1001 13:17:54.978104 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a542b47-9b01-40aa-ab73-d6067689a6d7-utilities\") pod \"redhat-operators-wfqdl\" (UID: \"9a542b47-9b01-40aa-ab73-d6067689a6d7\") " pod="openshift-marketplace/redhat-operators-wfqdl" Oct 01 13:17:54 crc kubenswrapper[4851]: I1001 13:17:54.978178 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcfvr\" (UniqueName: \"kubernetes.io/projected/9a542b47-9b01-40aa-ab73-d6067689a6d7-kube-api-access-fcfvr\") pod \"redhat-operators-wfqdl\" (UID: \"9a542b47-9b01-40aa-ab73-d6067689a6d7\") " pod="openshift-marketplace/redhat-operators-wfqdl" Oct 01 13:17:54 crc kubenswrapper[4851]: I1001 13:17:54.978576 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a542b47-9b01-40aa-ab73-d6067689a6d7-catalog-content\") pod \"redhat-operators-wfqdl\" (UID: \"9a542b47-9b01-40aa-ab73-d6067689a6d7\") " pod="openshift-marketplace/redhat-operators-wfqdl" Oct 01 13:17:54 crc kubenswrapper[4851]: I1001 13:17:54.978839 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a542b47-9b01-40aa-ab73-d6067689a6d7-utilities\") pod \"redhat-operators-wfqdl\" (UID: \"9a542b47-9b01-40aa-ab73-d6067689a6d7\") " pod="openshift-marketplace/redhat-operators-wfqdl" Oct 01 13:17:54 crc kubenswrapper[4851]: I1001 13:17:54.997698 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fcfvr\" (UniqueName: \"kubernetes.io/projected/9a542b47-9b01-40aa-ab73-d6067689a6d7-kube-api-access-fcfvr\") pod \"redhat-operators-wfqdl\" (UID: \"9a542b47-9b01-40aa-ab73-d6067689a6d7\") " pod="openshift-marketplace/redhat-operators-wfqdl" Oct 01 13:17:55 crc kubenswrapper[4851]: I1001 13:17:55.073089 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wfqdl" Oct 01 13:17:55 crc kubenswrapper[4851]: I1001 13:17:55.561570 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wfqdl"] Oct 01 13:17:55 crc kubenswrapper[4851]: I1001 13:17:55.870993 4851 generic.go:334] "Generic (PLEG): container finished" podID="9a542b47-9b01-40aa-ab73-d6067689a6d7" containerID="2da09483c9fdb31d1fd07a911e4557524c1a177af5bbb5a2bb4e63420f5f6a73" exitCode=0 Oct 01 13:17:55 crc kubenswrapper[4851]: I1001 13:17:55.871052 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wfqdl" event={"ID":"9a542b47-9b01-40aa-ab73-d6067689a6d7","Type":"ContainerDied","Data":"2da09483c9fdb31d1fd07a911e4557524c1a177af5bbb5a2bb4e63420f5f6a73"} Oct 01 13:17:55 crc kubenswrapper[4851]: I1001 13:17:55.871319 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wfqdl" event={"ID":"9a542b47-9b01-40aa-ab73-d6067689a6d7","Type":"ContainerStarted","Data":"352280349a1f4e433455973bc45e3de1108afe6ece9fe652e951f6e9adedcb73"} Oct 01 13:17:58 crc kubenswrapper[4851]: I1001 13:17:58.929254 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wfqdl" event={"ID":"9a542b47-9b01-40aa-ab73-d6067689a6d7","Type":"ContainerStarted","Data":"7607b0678a6e50b5b805d97c8cd9a835e46a547c316b9a021aafd0d0e715bbfe"} Oct 01 13:18:04 crc kubenswrapper[4851]: I1001 13:18:04.012170 4851 generic.go:334] "Generic (PLEG): container finished" podID="9eaff7ea-f90a-4cfa-8850-cf308591ca11" containerID="c51a6cc2fc8cb591576d8f310a01b22116bd03141078be0b4fd6f4d8d351a25f" exitCode=0 Oct 01 13:18:04 crc kubenswrapper[4851]: I1001 13:18:04.012265 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47" event={"ID":"9eaff7ea-f90a-4cfa-8850-cf308591ca11","Type":"ContainerDied","Data":"c51a6cc2fc8cb591576d8f310a01b22116bd03141078be0b4fd6f4d8d351a25f"} Oct 01 13:18:04 crc kubenswrapper[4851]: I1001 13:18:04.017454 4851 generic.go:334] "Generic (PLEG): container finished" podID="9a542b47-9b01-40aa-ab73-d6067689a6d7" containerID="7607b0678a6e50b5b805d97c8cd9a835e46a547c316b9a021aafd0d0e715bbfe" exitCode=0 Oct 01 13:18:04 crc kubenswrapper[4851]: I1001 13:18:04.017566 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wfqdl" event={"ID":"9a542b47-9b01-40aa-ab73-d6067689a6d7","Type":"ContainerDied","Data":"7607b0678a6e50b5b805d97c8cd9a835e46a547c316b9a021aafd0d0e715bbfe"} Oct 01 13:18:05 crc kubenswrapper[4851]: I1001 13:18:05.038844 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wfqdl" event={"ID":"9a542b47-9b01-40aa-ab73-d6067689a6d7","Type":"ContainerStarted","Data":"db733ab4ef7ddd3b33e13e5892a85fceb44d75ac2346d5cdfa122ed2388b57e3"} Oct 01 13:18:05 crc kubenswrapper[4851]: I1001 13:18:05.071928 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wfqdl" podStartSLOduration=2.441102282 
podStartE2EDuration="11.071899868s" podCreationTimestamp="2025-10-01 13:17:54 +0000 UTC" firstStartedPulling="2025-10-01 13:17:55.872787109 +0000 UTC m=+1484.217904595" lastFinishedPulling="2025-10-01 13:18:04.503584665 +0000 UTC m=+1492.848702181" observedRunningTime="2025-10-01 13:18:05.063528681 +0000 UTC m=+1493.408646177" watchObservedRunningTime="2025-10-01 13:18:05.071899868 +0000 UTC m=+1493.417017384" Oct 01 13:18:05 crc kubenswrapper[4851]: I1001 13:18:05.073490 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wfqdl" Oct 01 13:18:05 crc kubenswrapper[4851]: I1001 13:18:05.073567 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wfqdl" Oct 01 13:18:06 crc kubenswrapper[4851]: I1001 13:18:06.165450 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wfqdl" podUID="9a542b47-9b01-40aa-ab73-d6067689a6d7" containerName="registry-server" probeResult="failure" output=< Oct 01 13:18:06 crc kubenswrapper[4851]: timeout: failed to connect service ":50051" within 1s Oct 01 13:18:06 crc kubenswrapper[4851]: > Oct 01 13:18:06 crc kubenswrapper[4851]: I1001 13:18:06.192106 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47" Oct 01 13:18:06 crc kubenswrapper[4851]: I1001 13:18:06.332853 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9eaff7ea-f90a-4cfa-8850-cf308591ca11-ssh-key\") pod \"9eaff7ea-f90a-4cfa-8850-cf308591ca11\" (UID: \"9eaff7ea-f90a-4cfa-8850-cf308591ca11\") " Oct 01 13:18:06 crc kubenswrapper[4851]: I1001 13:18:06.333003 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eaff7ea-f90a-4cfa-8850-cf308591ca11-repo-setup-combined-ca-bundle\") pod \"9eaff7ea-f90a-4cfa-8850-cf308591ca11\" (UID: \"9eaff7ea-f90a-4cfa-8850-cf308591ca11\") " Oct 01 13:18:06 crc kubenswrapper[4851]: I1001 13:18:06.333113 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9eaff7ea-f90a-4cfa-8850-cf308591ca11-inventory\") pod \"9eaff7ea-f90a-4cfa-8850-cf308591ca11\" (UID: \"9eaff7ea-f90a-4cfa-8850-cf308591ca11\") " Oct 01 13:18:06 crc kubenswrapper[4851]: I1001 13:18:06.333133 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw6zk\" (UniqueName: \"kubernetes.io/projected/9eaff7ea-f90a-4cfa-8850-cf308591ca11-kube-api-access-mw6zk\") pod \"9eaff7ea-f90a-4cfa-8850-cf308591ca11\" (UID: \"9eaff7ea-f90a-4cfa-8850-cf308591ca11\") " Oct 01 13:18:06 crc kubenswrapper[4851]: I1001 13:18:06.345050 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eaff7ea-f90a-4cfa-8850-cf308591ca11-kube-api-access-mw6zk" (OuterVolumeSpecName: "kube-api-access-mw6zk") pod "9eaff7ea-f90a-4cfa-8850-cf308591ca11" (UID: "9eaff7ea-f90a-4cfa-8850-cf308591ca11"). InnerVolumeSpecName "kube-api-access-mw6zk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:18:06 crc kubenswrapper[4851]: I1001 13:18:06.348290 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eaff7ea-f90a-4cfa-8850-cf308591ca11-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "9eaff7ea-f90a-4cfa-8850-cf308591ca11" (UID: "9eaff7ea-f90a-4cfa-8850-cf308591ca11"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:18:06 crc kubenswrapper[4851]: I1001 13:18:06.364653 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eaff7ea-f90a-4cfa-8850-cf308591ca11-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9eaff7ea-f90a-4cfa-8850-cf308591ca11" (UID: "9eaff7ea-f90a-4cfa-8850-cf308591ca11"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:18:06 crc kubenswrapper[4851]: I1001 13:18:06.378315 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eaff7ea-f90a-4cfa-8850-cf308591ca11-inventory" (OuterVolumeSpecName: "inventory") pod "9eaff7ea-f90a-4cfa-8850-cf308591ca11" (UID: "9eaff7ea-f90a-4cfa-8850-cf308591ca11"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:18:06 crc kubenswrapper[4851]: I1001 13:18:06.438329 4851 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eaff7ea-f90a-4cfa-8850-cf308591ca11-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:18:06 crc kubenswrapper[4851]: I1001 13:18:06.438399 4851 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9eaff7ea-f90a-4cfa-8850-cf308591ca11-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:18:06 crc kubenswrapper[4851]: I1001 13:18:06.438431 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw6zk\" (UniqueName: \"kubernetes.io/projected/9eaff7ea-f90a-4cfa-8850-cf308591ca11-kube-api-access-mw6zk\") on node \"crc\" DevicePath \"\"" Oct 01 13:18:06 crc kubenswrapper[4851]: I1001 13:18:06.438456 4851 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9eaff7ea-f90a-4cfa-8850-cf308591ca11-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:18:07 crc kubenswrapper[4851]: I1001 13:18:07.069325 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47" event={"ID":"9eaff7ea-f90a-4cfa-8850-cf308591ca11","Type":"ContainerDied","Data":"585ab23e6e605b7d4ca24679e346dd205207b3891ab5fe1688ba2ee60f604124"} Oct 01 13:18:07 crc kubenswrapper[4851]: I1001 13:18:07.069808 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="585ab23e6e605b7d4ca24679e346dd205207b3891ab5fe1688ba2ee60f604124" Oct 01 13:18:07 crc kubenswrapper[4851]: I1001 13:18:07.069717 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47" Oct 01 13:18:07 crc kubenswrapper[4851]: I1001 13:18:07.309387 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-c2qg6"] Oct 01 13:18:07 crc kubenswrapper[4851]: E1001 13:18:07.309960 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eaff7ea-f90a-4cfa-8850-cf308591ca11" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 13:18:07 crc kubenswrapper[4851]: I1001 13:18:07.309979 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eaff7ea-f90a-4cfa-8850-cf308591ca11" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 13:18:07 crc kubenswrapper[4851]: I1001 13:18:07.310234 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="9eaff7ea-f90a-4cfa-8850-cf308591ca11" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 13:18:07 crc kubenswrapper[4851]: I1001 13:18:07.311460 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c2qg6" Oct 01 13:18:07 crc kubenswrapper[4851]: I1001 13:18:07.313928 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:18:07 crc kubenswrapper[4851]: I1001 13:18:07.314307 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:18:07 crc kubenswrapper[4851]: I1001 13:18:07.314520 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:18:07 crc kubenswrapper[4851]: I1001 13:18:07.314567 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2tz4d" Oct 01 13:18:07 crc kubenswrapper[4851]: I1001 13:18:07.335844 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-c2qg6"] Oct 01 13:18:07 crc kubenswrapper[4851]: I1001 13:18:07.460636 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c97f93db-5638-4aa6-b20d-98b1602301af-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c2qg6\" (UID: \"c97f93db-5638-4aa6-b20d-98b1602301af\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c2qg6" Oct 01 13:18:07 crc kubenswrapper[4851]: I1001 13:18:07.460747 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rljch\" (UniqueName: \"kubernetes.io/projected/c97f93db-5638-4aa6-b20d-98b1602301af-kube-api-access-rljch\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c2qg6\" (UID: \"c97f93db-5638-4aa6-b20d-98b1602301af\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c2qg6" Oct 01 13:18:07 crc kubenswrapper[4851]: I1001 13:18:07.460986 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c97f93db-5638-4aa6-b20d-98b1602301af-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c2qg6\" (UID: \"c97f93db-5638-4aa6-b20d-98b1602301af\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c2qg6" Oct 01 13:18:07 crc kubenswrapper[4851]: I1001 13:18:07.563069 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/c97f93db-5638-4aa6-b20d-98b1602301af-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c2qg6\" (UID: \"c97f93db-5638-4aa6-b20d-98b1602301af\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c2qg6" Oct 01 13:18:07 crc kubenswrapper[4851]: I1001 13:18:07.563273 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rljch\" (UniqueName: \"kubernetes.io/projected/c97f93db-5638-4aa6-b20d-98b1602301af-kube-api-access-rljch\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c2qg6\" (UID: \"c97f93db-5638-4aa6-b20d-98b1602301af\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c2qg6" Oct 01 13:18:07 crc kubenswrapper[4851]: I1001 13:18:07.563432 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c97f93db-5638-4aa6-b20d-98b1602301af-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c2qg6\" (UID: \"c97f93db-5638-4aa6-b20d-98b1602301af\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c2qg6" Oct 01 13:18:07 crc kubenswrapper[4851]: I1001 13:18:07.568261 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c97f93db-5638-4aa6-b20d-98b1602301af-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c2qg6\" (UID: \"c97f93db-5638-4aa6-b20d-98b1602301af\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c2qg6" Oct 01 13:18:07 crc kubenswrapper[4851]: I1001 13:18:07.571183 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c97f93db-5638-4aa6-b20d-98b1602301af-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c2qg6\" (UID: \"c97f93db-5638-4aa6-b20d-98b1602301af\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c2qg6" Oct 01 13:18:07 crc kubenswrapper[4851]: I1001 13:18:07.583122 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rljch\" (UniqueName: \"kubernetes.io/projected/c97f93db-5638-4aa6-b20d-98b1602301af-kube-api-access-rljch\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c2qg6\" (UID: \"c97f93db-5638-4aa6-b20d-98b1602301af\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c2qg6" Oct 01 13:18:07 crc kubenswrapper[4851]: I1001 13:18:07.648132 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c2qg6" Oct 01 13:18:08 crc kubenswrapper[4851]: I1001 13:18:08.356786 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-c2qg6"] Oct 01 13:18:09 crc kubenswrapper[4851]: I1001 13:18:09.098223 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c2qg6" event={"ID":"c97f93db-5638-4aa6-b20d-98b1602301af","Type":"ContainerStarted","Data":"1a7752a7d66618ce4782b9797f7ee7dc126cb10653cf1140000f445801818e60"} Oct 01 13:18:10 crc kubenswrapper[4851]: I1001 13:18:10.112675 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c2qg6" event={"ID":"c97f93db-5638-4aa6-b20d-98b1602301af","Type":"ContainerStarted","Data":"b7c1627b537fb0e3a5ec9a3376ed2269a56e27781009f7ede5bbeaf48488e4c4"} Oct 01 13:18:10 crc kubenswrapper[4851]: I1001 13:18:10.150520 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c2qg6" podStartSLOduration=2.006562284 podStartE2EDuration="3.150480287s" podCreationTimestamp="2025-10-01 13:18:07 +0000 UTC" firstStartedPulling="2025-10-01 13:18:08.365937548 +0000 UTC m=+1496.711055034" lastFinishedPulling="2025-10-01 13:18:09.509855501 +0000 UTC m=+1497.854973037" observedRunningTime="2025-10-01 13:18:10.136681815 +0000 UTC m=+1498.481799331" watchObservedRunningTime="2025-10-01 13:18:10.150480287 +0000 UTC m=+1498.495597793" Oct 01 13:18:13 crc kubenswrapper[4851]: I1001 13:18:13.146762 4851 generic.go:334] "Generic (PLEG): container finished" podID="c97f93db-5638-4aa6-b20d-98b1602301af" containerID="b7c1627b537fb0e3a5ec9a3376ed2269a56e27781009f7ede5bbeaf48488e4c4" exitCode=0 Oct 01 13:18:13 crc kubenswrapper[4851]: I1001 13:18:13.146894 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c2qg6" event={"ID":"c97f93db-5638-4aa6-b20d-98b1602301af","Type":"ContainerDied","Data":"b7c1627b537fb0e3a5ec9a3376ed2269a56e27781009f7ede5bbeaf48488e4c4"} Oct 01 13:18:14 crc kubenswrapper[4851]: I1001 13:18:14.672656 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c2qg6" Oct 01 13:18:14 crc kubenswrapper[4851]: I1001 13:18:14.834924 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c97f93db-5638-4aa6-b20d-98b1602301af-ssh-key\") pod \"c97f93db-5638-4aa6-b20d-98b1602301af\" (UID: \"c97f93db-5638-4aa6-b20d-98b1602301af\") " Oct 01 13:18:14 crc kubenswrapper[4851]: I1001 13:18:14.835305 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rljch\" (UniqueName: \"kubernetes.io/projected/c97f93db-5638-4aa6-b20d-98b1602301af-kube-api-access-rljch\") pod \"c97f93db-5638-4aa6-b20d-98b1602301af\" (UID: \"c97f93db-5638-4aa6-b20d-98b1602301af\") " Oct 01 13:18:14 crc kubenswrapper[4851]: I1001 13:18:14.835512 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c97f93db-5638-4aa6-b20d-98b1602301af-inventory\") pod \"c97f93db-5638-4aa6-b20d-98b1602301af\" (UID: \"c97f93db-5638-4aa6-b20d-98b1602301af\") " Oct 01 13:18:14 crc kubenswrapper[4851]: I1001 13:18:14.840191 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c97f93db-5638-4aa6-b20d-98b1602301af-kube-api-access-rljch" (OuterVolumeSpecName: "kube-api-access-rljch") pod "c97f93db-5638-4aa6-b20d-98b1602301af" (UID: "c97f93db-5638-4aa6-b20d-98b1602301af"). InnerVolumeSpecName "kube-api-access-rljch". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:18:14 crc kubenswrapper[4851]: I1001 13:18:14.863883 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c97f93db-5638-4aa6-b20d-98b1602301af-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c97f93db-5638-4aa6-b20d-98b1602301af" (UID: "c97f93db-5638-4aa6-b20d-98b1602301af"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:18:14 crc kubenswrapper[4851]: I1001 13:18:14.895384 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c97f93db-5638-4aa6-b20d-98b1602301af-inventory" (OuterVolumeSpecName: "inventory") pod "c97f93db-5638-4aa6-b20d-98b1602301af" (UID: "c97f93db-5638-4aa6-b20d-98b1602301af"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:18:14 crc kubenswrapper[4851]: I1001 13:18:14.939067 4851 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c97f93db-5638-4aa6-b20d-98b1602301af-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:18:14 crc kubenswrapper[4851]: I1001 13:18:14.939102 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rljch\" (UniqueName: \"kubernetes.io/projected/c97f93db-5638-4aa6-b20d-98b1602301af-kube-api-access-rljch\") on node \"crc\" DevicePath \"\"" Oct 01 13:18:14 crc kubenswrapper[4851]: I1001 13:18:14.939117 4851 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c97f93db-5638-4aa6-b20d-98b1602301af-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.053823 4851 scope.go:117] "RemoveContainer" containerID="a732157d5877bbcdd56e964be87aa9441d7057316378fd23aa232f972176764b" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.169661 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c2qg6" event={"ID":"c97f93db-5638-4aa6-b20d-98b1602301af","Type":"ContainerDied","Data":"1a7752a7d66618ce4782b9797f7ee7dc126cb10653cf1140000f445801818e60"} Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.170226 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a7752a7d66618ce4782b9797f7ee7dc126cb10653cf1140000f445801818e60" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.169711 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c2qg6" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.250525 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf"] Oct 01 13:18:15 crc kubenswrapper[4851]: E1001 13:18:15.250902 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c97f93db-5638-4aa6-b20d-98b1602301af" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.250917 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="c97f93db-5638-4aa6-b20d-98b1602301af" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.251095 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="c97f93db-5638-4aa6-b20d-98b1602301af" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.251756 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.254704 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.256298 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.256378 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.257166 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2tz4d" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.266790 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf"] Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.347186 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n492h\" (UniqueName: \"kubernetes.io/projected/db4056ce-42b4-4853-9e9f-69320e29e5cc-kube-api-access-n492h\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf\" (UID: \"db4056ce-42b4-4853-9e9f-69320e29e5cc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.347430 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4056ce-42b4-4853-9e9f-69320e29e5cc-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf\" (UID: \"db4056ce-42b4-4853-9e9f-69320e29e5cc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.347579 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db4056ce-42b4-4853-9e9f-69320e29e5cc-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf\" (UID: \"db4056ce-42b4-4853-9e9f-69320e29e5cc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.347677 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db4056ce-42b4-4853-9e9f-69320e29e5cc-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf\" (UID: \"db4056ce-42b4-4853-9e9f-69320e29e5cc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.449893 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db4056ce-42b4-4853-9e9f-69320e29e5cc-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf\" (UID: \"db4056ce-42b4-4853-9e9f-69320e29e5cc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.450164 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db4056ce-42b4-4853-9e9f-69320e29e5cc-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf\" (UID: 
\"db4056ce-42b4-4853-9e9f-69320e29e5cc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.450397 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n492h\" (UniqueName: \"kubernetes.io/projected/db4056ce-42b4-4853-9e9f-69320e29e5cc-kube-api-access-n492h\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf\" (UID: \"db4056ce-42b4-4853-9e9f-69320e29e5cc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.450489 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4056ce-42b4-4853-9e9f-69320e29e5cc-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf\" (UID: \"db4056ce-42b4-4853-9e9f-69320e29e5cc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.455337 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4056ce-42b4-4853-9e9f-69320e29e5cc-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf\" (UID: \"db4056ce-42b4-4853-9e9f-69320e29e5cc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.455610 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db4056ce-42b4-4853-9e9f-69320e29e5cc-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf\" (UID: \"db4056ce-42b4-4853-9e9f-69320e29e5cc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.455880 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db4056ce-42b4-4853-9e9f-69320e29e5cc-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf\" (UID: \"db4056ce-42b4-4853-9e9f-69320e29e5cc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.470182 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n492h\" (UniqueName: \"kubernetes.io/projected/db4056ce-42b4-4853-9e9f-69320e29e5cc-kube-api-access-n492h\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf\" (UID: \"db4056ce-42b4-4853-9e9f-69320e29e5cc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf" Oct 01 13:18:15 crc kubenswrapper[4851]: I1001 13:18:15.567979 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf" Oct 01 13:18:16 crc kubenswrapper[4851]: I1001 13:18:16.121162 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wfqdl" podUID="9a542b47-9b01-40aa-ab73-d6067689a6d7" containerName="registry-server" probeResult="failure" output=< Oct 01 13:18:16 crc kubenswrapper[4851]: timeout: failed to connect service ":50051" within 1s Oct 01 13:18:16 crc kubenswrapper[4851]: > Oct 01 13:18:16 crc kubenswrapper[4851]: I1001 13:18:16.169783 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf"] Oct 01 13:18:16 crc kubenswrapper[4851]: I1001 13:18:16.179034 4851 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:18:17 crc kubenswrapper[4851]: I1001 13:18:17.191431 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf" event={"ID":"db4056ce-42b4-4853-9e9f-69320e29e5cc","Type":"ContainerStarted","Data":"93983fa72d641fbf2505f640b146cf2829e1e962a811b1eab0eb29cf8a4bc275"} Oct 01 13:18:17 crc kubenswrapper[4851]: I1001 13:18:17.191952 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf" event={"ID":"db4056ce-42b4-4853-9e9f-69320e29e5cc","Type":"ContainerStarted","Data":"04bb6ff22dc32dc9c717050e9186d58338ffc86fdf2d4689591166a0826de8ba"} Oct 01 13:18:17 crc kubenswrapper[4851]: I1001 13:18:17.213673 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf" podStartSLOduration=1.724449956 podStartE2EDuration="2.213646493s" podCreationTimestamp="2025-10-01 13:18:15 +0000 UTC" firstStartedPulling="2025-10-01 13:18:16.178818316 +0000 UTC m=+1504.523935802" lastFinishedPulling="2025-10-01 13:18:16.668014863 +0000 UTC m=+1505.013132339" observedRunningTime="2025-10-01 13:18:17.209789803 +0000 UTC m=+1505.554907329" watchObservedRunningTime="2025-10-01 13:18:17.213646493 +0000 UTC m=+1505.558764019" Oct 01 13:18:26 crc kubenswrapper[4851]: I1001 13:18:26.159386 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wfqdl" podUID="9a542b47-9b01-40aa-ab73-d6067689a6d7" containerName="registry-server" probeResult="failure" output=< Oct 01 13:18:26 crc kubenswrapper[4851]: timeout: failed to connect service ":50051" within 1s Oct 01 13:18:26 crc kubenswrapper[4851]: > Oct 01 13:18:30 crc kubenswrapper[4851]: I1001 13:18:30.050021 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:18:30 crc kubenswrapper[4851]: I1001 13:18:30.050368 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:18:35 crc kubenswrapper[4851]: I1001 13:18:35.146804 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wfqdl" Oct 01 13:18:35 
Oct 01 13:18:35 crc kubenswrapper[4851]: I1001 13:18:35.217790 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wfqdl" Oct 01 13:18:35 crc kubenswrapper[4851]: I1001 13:18:35.408007 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wfqdl"] Oct 01 13:18:36 crc kubenswrapper[4851]: I1001 13:18:36.453969 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wfqdl" podUID="9a542b47-9b01-40aa-ab73-d6067689a6d7" containerName="registry-server" containerID="cri-o://db733ab4ef7ddd3b33e13e5892a85fceb44d75ac2346d5cdfa122ed2388b57e3" gracePeriod=2 Oct 01 13:18:36 crc kubenswrapper[4851]: I1001 13:18:36.954621 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wfqdl" Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.063029 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcfvr\" (UniqueName: \"kubernetes.io/projected/9a542b47-9b01-40aa-ab73-d6067689a6d7-kube-api-access-fcfvr\") pod \"9a542b47-9b01-40aa-ab73-d6067689a6d7\" (UID: \"9a542b47-9b01-40aa-ab73-d6067689a6d7\") " Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.063073 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a542b47-9b01-40aa-ab73-d6067689a6d7-utilities\") pod \"9a542b47-9b01-40aa-ab73-d6067689a6d7\" (UID: \"9a542b47-9b01-40aa-ab73-d6067689a6d7\") " Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.063212 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a542b47-9b01-40aa-ab73-d6067689a6d7-catalog-content\") pod \"9a542b47-9b01-40aa-ab73-d6067689a6d7\" (UID: \"9a542b47-9b01-40aa-ab73-d6067689a6d7\") " Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.064270 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a542b47-9b01-40aa-ab73-d6067689a6d7-utilities" (OuterVolumeSpecName: "utilities") pod "9a542b47-9b01-40aa-ab73-d6067689a6d7" (UID: "9a542b47-9b01-40aa-ab73-d6067689a6d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.068920 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a542b47-9b01-40aa-ab73-d6067689a6d7-kube-api-access-fcfvr" (OuterVolumeSpecName: "kube-api-access-fcfvr") pod "9a542b47-9b01-40aa-ab73-d6067689a6d7" (UID: "9a542b47-9b01-40aa-ab73-d6067689a6d7"). InnerVolumeSpecName "kube-api-access-fcfvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.154845 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a542b47-9b01-40aa-ab73-d6067689a6d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a542b47-9b01-40aa-ab73-d6067689a6d7" (UID: "9a542b47-9b01-40aa-ab73-d6067689a6d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.165831 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a542b47-9b01-40aa-ab73-d6067689a6d7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.166036 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcfvr\" (UniqueName: \"kubernetes.io/projected/9a542b47-9b01-40aa-ab73-d6067689a6d7-kube-api-access-fcfvr\") on node \"crc\" DevicePath \"\"" Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.166100 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a542b47-9b01-40aa-ab73-d6067689a6d7-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.464300 4851 generic.go:334] "Generic (PLEG): container finished" podID="9a542b47-9b01-40aa-ab73-d6067689a6d7" containerID="db733ab4ef7ddd3b33e13e5892a85fceb44d75ac2346d5cdfa122ed2388b57e3" exitCode=0 Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.464358 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wfqdl" Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.464371 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wfqdl" event={"ID":"9a542b47-9b01-40aa-ab73-d6067689a6d7","Type":"ContainerDied","Data":"db733ab4ef7ddd3b33e13e5892a85fceb44d75ac2346d5cdfa122ed2388b57e3"} Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.465625 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wfqdl" event={"ID":"9a542b47-9b01-40aa-ab73-d6067689a6d7","Type":"ContainerDied","Data":"352280349a1f4e433455973bc45e3de1108afe6ece9fe652e951f6e9adedcb73"} Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.465701 4851 scope.go:117] "RemoveContainer" containerID="db733ab4ef7ddd3b33e13e5892a85fceb44d75ac2346d5cdfa122ed2388b57e3" Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.505109 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wfqdl"] Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.509318 4851 scope.go:117] "RemoveContainer" containerID="7607b0678a6e50b5b805d97c8cd9a835e46a547c316b9a021aafd0d0e715bbfe" Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.513981 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wfqdl"] Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.536347 4851 scope.go:117] "RemoveContainer" containerID="2da09483c9fdb31d1fd07a911e4557524c1a177af5bbb5a2bb4e63420f5f6a73" Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.613033 4851 scope.go:117] "RemoveContainer" containerID="db733ab4ef7ddd3b33e13e5892a85fceb44d75ac2346d5cdfa122ed2388b57e3" Oct 01 13:18:37 crc kubenswrapper[4851]: E1001 13:18:37.613528 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db733ab4ef7ddd3b33e13e5892a85fceb44d75ac2346d5cdfa122ed2388b57e3\": container with ID starting with db733ab4ef7ddd3b33e13e5892a85fceb44d75ac2346d5cdfa122ed2388b57e3 not found: ID does not exist" containerID="db733ab4ef7ddd3b33e13e5892a85fceb44d75ac2346d5cdfa122ed2388b57e3" Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.613570 4851 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db733ab4ef7ddd3b33e13e5892a85fceb44d75ac2346d5cdfa122ed2388b57e3"} err="failed to get container status \"db733ab4ef7ddd3b33e13e5892a85fceb44d75ac2346d5cdfa122ed2388b57e3\": rpc error: code = NotFound desc = could not find container \"db733ab4ef7ddd3b33e13e5892a85fceb44d75ac2346d5cdfa122ed2388b57e3\": container with ID starting with db733ab4ef7ddd3b33e13e5892a85fceb44d75ac2346d5cdfa122ed2388b57e3 not found: ID does not exist" Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.613596 4851 scope.go:117] "RemoveContainer" containerID="7607b0678a6e50b5b805d97c8cd9a835e46a547c316b9a021aafd0d0e715bbfe" Oct 01 13:18:37 crc kubenswrapper[4851]: E1001 13:18:37.614073 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7607b0678a6e50b5b805d97c8cd9a835e46a547c316b9a021aafd0d0e715bbfe\": container with ID starting with 7607b0678a6e50b5b805d97c8cd9a835e46a547c316b9a021aafd0d0e715bbfe not found: ID does not exist" containerID="7607b0678a6e50b5b805d97c8cd9a835e46a547c316b9a021aafd0d0e715bbfe" Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.614100 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7607b0678a6e50b5b805d97c8cd9a835e46a547c316b9a021aafd0d0e715bbfe"} err="failed to get container status \"7607b0678a6e50b5b805d97c8cd9a835e46a547c316b9a021aafd0d0e715bbfe\": rpc error: code = NotFound desc = could not find container \"7607b0678a6e50b5b805d97c8cd9a835e46a547c316b9a021aafd0d0e715bbfe\": container with ID starting with 7607b0678a6e50b5b805d97c8cd9a835e46a547c316b9a021aafd0d0e715bbfe not found: ID does not exist" Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.614113 4851 scope.go:117] "RemoveContainer" containerID="2da09483c9fdb31d1fd07a911e4557524c1a177af5bbb5a2bb4e63420f5f6a73" Oct 01 13:18:37 crc kubenswrapper[4851]: E1001 13:18:37.614446 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2da09483c9fdb31d1fd07a911e4557524c1a177af5bbb5a2bb4e63420f5f6a73\": container with ID starting with 2da09483c9fdb31d1fd07a911e4557524c1a177af5bbb5a2bb4e63420f5f6a73 not found: ID does not exist" containerID="2da09483c9fdb31d1fd07a911e4557524c1a177af5bbb5a2bb4e63420f5f6a73" Oct 01 13:18:37 crc kubenswrapper[4851]: I1001 13:18:37.614520 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2da09483c9fdb31d1fd07a911e4557524c1a177af5bbb5a2bb4e63420f5f6a73"} err="failed to get container status \"2da09483c9fdb31d1fd07a911e4557524c1a177af5bbb5a2bb4e63420f5f6a73\": rpc error: code = NotFound desc = could not find container \"2da09483c9fdb31d1fd07a911e4557524c1a177af5bbb5a2bb4e63420f5f6a73\": container with ID starting with 2da09483c9fdb31d1fd07a911e4557524c1a177af5bbb5a2bb4e63420f5f6a73 not found: ID does not exist" Oct 01 13:18:38 crc kubenswrapper[4851]: I1001 13:18:38.344960 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a542b47-9b01-40aa-ab73-d6067689a6d7" path="/var/lib/kubelet/pods/9a542b47-9b01-40aa-ab73-d6067689a6d7/volumes" Oct 01 13:19:00 crc kubenswrapper[4851]: I1001 13:19:00.050035 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:19:00 crc kubenswrapper[4851]: I1001 13:19:00.051046 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:19:15 crc kubenswrapper[4851]: I1001 13:19:15.183886 4851 scope.go:117] "RemoveContainer" containerID="3ce8c3f4069070fb48b679c33367c42a6c287375a7b2ae6b3ccf0645c2683bef" Oct 01 13:19:30 crc kubenswrapper[4851]: I1001 13:19:30.050877 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:19:30 crc kubenswrapper[4851]: I1001 13:19:30.051711 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:19:30 crc kubenswrapper[4851]: I1001 13:19:30.051787 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 13:19:30 crc kubenswrapper[4851]: I1001 13:19:30.052984 4851 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7"} pod="openshift-machine-config-operator/machine-config-daemon-fv72m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:19:30 crc kubenswrapper[4851]: I1001 13:19:30.053115 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" containerID="cri-o://72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" gracePeriod=600 Oct 01 13:19:30 crc kubenswrapper[4851]: E1001 13:19:30.177755 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:19:31 crc kubenswrapper[4851]: I1001 13:19:31.173218 4851 generic.go:334] "Generic (PLEG): container finished" podID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" exitCode=0 Oct 01 13:19:31 crc kubenswrapper[4851]: I1001 13:19:31.173282 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerDied","Data":"72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7"} Oct 01 13:19:31 crc kubenswrapper[4851]: I1001 
13:19:31.173806 4851 scope.go:117] "RemoveContainer" containerID="45376e593b7f479231d2d5e58c334337acd5d47c4a95ca6e3b37f5047096d591" Oct 01 13:19:31 crc kubenswrapper[4851]: I1001 13:19:31.174578 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:19:31 crc kubenswrapper[4851]: E1001 13:19:31.175163 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:19:42 crc kubenswrapper[4851]: I1001 13:19:42.342161 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:19:42 crc kubenswrapper[4851]: E1001 13:19:42.342863 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:19:55 crc kubenswrapper[4851]: I1001 13:19:55.328837 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:19:55 crc kubenswrapper[4851]: E1001 13:19:55.329516 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:20:08 crc kubenswrapper[4851]: I1001 13:20:08.328937 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:20:08 crc kubenswrapper[4851]: E1001 13:20:08.330643 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:20:15 crc kubenswrapper[4851]: I1001 13:20:15.288311 4851 scope.go:117] "RemoveContainer" containerID="80fb442d0f64203d1cafcc4a1acef5d34a900b62f1b35fdbde40e0bb597dab56" Oct 01 13:20:15 crc kubenswrapper[4851]: I1001 13:20:15.323336 4851 scope.go:117] "RemoveContainer" containerID="0c780ef1d09c95cce3974f0e90e1b5c259f6f24de061c7126ed0c371da4430aa" Oct 01 13:20:15 crc kubenswrapper[4851]: I1001 13:20:15.354285 4851 scope.go:117] "RemoveContainer" containerID="6f68efdbd0237b97cf6a4c1fe0a1b4e696696562807dac83f5933afc5b69b61f" Oct 01 13:20:15 crc kubenswrapper[4851]: I1001 13:20:15.384270 4851 scope.go:117] "RemoveContainer" containerID="13044738ec35137f45762e16074e74ac068396b8c20c1d1ae74ad1216f8edbad" Oct 01 13:20:22 
crc kubenswrapper[4851]: I1001 13:20:22.344181 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:20:22 crc kubenswrapper[4851]: E1001 13:20:22.345388 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:20:37 crc kubenswrapper[4851]: I1001 13:20:37.328914 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:20:37 crc kubenswrapper[4851]: E1001 13:20:37.330123 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:20:48 crc kubenswrapper[4851]: I1001 13:20:48.328674 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:20:48 crc kubenswrapper[4851]: E1001 13:20:48.329576 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:21:02 crc kubenswrapper[4851]: I1001 13:21:02.344367 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:21:02 crc kubenswrapper[4851]: E1001 13:21:02.345686 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:21:16 crc kubenswrapper[4851]: I1001 13:21:16.049832 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-wp2hj"] Oct 01 13:21:16 crc kubenswrapper[4851]: I1001 13:21:16.060507 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-wp2hj"] Oct 01 13:21:16 crc kubenswrapper[4851]: I1001 13:21:16.329986 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:21:16 crc kubenswrapper[4851]: E1001 13:21:16.330790 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:21:16 crc kubenswrapper[4851]: I1001 13:21:16.342917 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6493c0f-dc39-4b1b-a93e-e8a4bf323729" path="/var/lib/kubelet/pods/e6493c0f-dc39-4b1b-a93e-e8a4bf323729/volumes" Oct 01 13:21:20 crc kubenswrapper[4851]: I1001 13:21:20.040117 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-swbm8"] Oct 01 13:21:20 crc kubenswrapper[4851]: I1001 13:21:20.054158 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-swbm8"] Oct 01 13:21:20 crc kubenswrapper[4851]: I1001 13:21:20.343540 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="891a835a-a012-4d76-a61f-393fbef5692f" path="/var/lib/kubelet/pods/891a835a-a012-4d76-a61f-393fbef5692f/volumes" Oct 01 13:21:26 crc kubenswrapper[4851]: I1001 13:21:26.077265 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-qtxhk"] Oct 01 13:21:26 crc kubenswrapper[4851]: I1001 13:21:26.088189 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-dkxdn"] Oct 01 13:21:26 crc kubenswrapper[4851]: I1001 13:21:26.098082 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-qtxhk"] Oct 01 13:21:26 crc kubenswrapper[4851]: I1001 13:21:26.105631 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-dkxdn"] Oct 01 13:21:26 crc kubenswrapper[4851]: I1001 13:21:26.338990 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="079f3beb-593f-4866-ae54-8e1fb880c9b1" path="/var/lib/kubelet/pods/079f3beb-593f-4866-ae54-8e1fb880c9b1/volumes" Oct 01 13:21:26 crc kubenswrapper[4851]: I1001 13:21:26.340568 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef457f3c-0b63-4180-a5a7-bbf92e55a195" path="/var/lib/kubelet/pods/ef457f3c-0b63-4180-a5a7-bbf92e55a195/volumes" Oct 01 13:21:27 crc kubenswrapper[4851]: I1001 13:21:27.044858 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-d2a4-account-create-gvkcm"] Oct 01 13:21:27 crc kubenswrapper[4851]: I1001 13:21:27.060066 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-d2a4-account-create-gvkcm"] Oct 01 13:21:28 crc kubenswrapper[4851]: I1001 13:21:28.346826 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b8bb329-e08f-415d-a643-94cffe80471b" path="/var/lib/kubelet/pods/6b8bb329-e08f-415d-a643-94cffe80471b/volumes" Oct 01 13:21:29 crc kubenswrapper[4851]: I1001 13:21:29.328883 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:21:29 crc kubenswrapper[4851]: E1001 13:21:29.329234 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:21:39 crc kubenswrapper[4851]: I1001 13:21:39.054211 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-32f8-account-create-7mdj5"] Oct 01 13:21:39 crc kubenswrapper[4851]: I1001 
13:21:39.073352 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-32f8-account-create-7mdj5"] Oct 01 13:21:40 crc kubenswrapper[4851]: I1001 13:21:40.339017 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45d8621f-665c-42d4-81bd-43afeadc7b7d" path="/var/lib/kubelet/pods/45d8621f-665c-42d4-81bd-43afeadc7b7d/volumes" Oct 01 13:21:42 crc kubenswrapper[4851]: I1001 13:21:42.337399 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:21:42 crc kubenswrapper[4851]: E1001 13:21:42.338882 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:21:44 crc kubenswrapper[4851]: I1001 13:21:44.051646 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-96c5-account-create-rqjrx"] Oct 01 13:21:44 crc kubenswrapper[4851]: I1001 13:21:44.062674 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-8535-account-create-j68t5"] Oct 01 13:21:44 crc kubenswrapper[4851]: I1001 13:21:44.073614 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-96c5-account-create-rqjrx"] Oct 01 13:21:44 crc kubenswrapper[4851]: I1001 13:21:44.080809 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-8535-account-create-j68t5"] Oct 01 13:21:44 crc kubenswrapper[4851]: I1001 13:21:44.346772 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="700c50aa-a63d-44ea-8c51-4b76771caf18" path="/var/lib/kubelet/pods/700c50aa-a63d-44ea-8c51-4b76771caf18/volumes" Oct 01 13:21:44 crc kubenswrapper[4851]: I1001 13:21:44.347490 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a661d676-d45b-4cf6-8b29-31c28fb2c46b" path="/var/lib/kubelet/pods/a661d676-d45b-4cf6-8b29-31c28fb2c46b/volumes" Oct 01 13:21:57 crc kubenswrapper[4851]: I1001 13:21:57.328874 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:21:57 crc kubenswrapper[4851]: E1001 13:21:57.330034 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:22:04 crc kubenswrapper[4851]: I1001 13:22:04.010357 4851 generic.go:334] "Generic (PLEG): container finished" podID="db4056ce-42b4-4853-9e9f-69320e29e5cc" containerID="93983fa72d641fbf2505f640b146cf2829e1e962a811b1eab0eb29cf8a4bc275" exitCode=0 Oct 01 13:22:04 crc kubenswrapper[4851]: I1001 13:22:04.010483 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf" event={"ID":"db4056ce-42b4-4853-9e9f-69320e29e5cc","Type":"ContainerDied","Data":"93983fa72d641fbf2505f640b146cf2829e1e962a811b1eab0eb29cf8a4bc275"} Oct 01 13:22:05 crc kubenswrapper[4851]: I1001 13:22:05.527494 
4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf" Oct 01 13:22:05 crc kubenswrapper[4851]: I1001 13:22:05.687832 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n492h\" (UniqueName: \"kubernetes.io/projected/db4056ce-42b4-4853-9e9f-69320e29e5cc-kube-api-access-n492h\") pod \"db4056ce-42b4-4853-9e9f-69320e29e5cc\" (UID: \"db4056ce-42b4-4853-9e9f-69320e29e5cc\") " Oct 01 13:22:05 crc kubenswrapper[4851]: I1001 13:22:05.687880 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db4056ce-42b4-4853-9e9f-69320e29e5cc-ssh-key\") pod \"db4056ce-42b4-4853-9e9f-69320e29e5cc\" (UID: \"db4056ce-42b4-4853-9e9f-69320e29e5cc\") " Oct 01 13:22:05 crc kubenswrapper[4851]: I1001 13:22:05.687930 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4056ce-42b4-4853-9e9f-69320e29e5cc-bootstrap-combined-ca-bundle\") pod \"db4056ce-42b4-4853-9e9f-69320e29e5cc\" (UID: \"db4056ce-42b4-4853-9e9f-69320e29e5cc\") " Oct 01 13:22:05 crc kubenswrapper[4851]: I1001 13:22:05.688004 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db4056ce-42b4-4853-9e9f-69320e29e5cc-inventory\") pod \"db4056ce-42b4-4853-9e9f-69320e29e5cc\" (UID: \"db4056ce-42b4-4853-9e9f-69320e29e5cc\") " Oct 01 13:22:05 crc kubenswrapper[4851]: I1001 13:22:05.694929 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db4056ce-42b4-4853-9e9f-69320e29e5cc-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "db4056ce-42b4-4853-9e9f-69320e29e5cc" (UID: "db4056ce-42b4-4853-9e9f-69320e29e5cc"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:22:05 crc kubenswrapper[4851]: I1001 13:22:05.695758 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db4056ce-42b4-4853-9e9f-69320e29e5cc-kube-api-access-n492h" (OuterVolumeSpecName: "kube-api-access-n492h") pod "db4056ce-42b4-4853-9e9f-69320e29e5cc" (UID: "db4056ce-42b4-4853-9e9f-69320e29e5cc"). InnerVolumeSpecName "kube-api-access-n492h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:22:05 crc kubenswrapper[4851]: I1001 13:22:05.723156 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db4056ce-42b4-4853-9e9f-69320e29e5cc-inventory" (OuterVolumeSpecName: "inventory") pod "db4056ce-42b4-4853-9e9f-69320e29e5cc" (UID: "db4056ce-42b4-4853-9e9f-69320e29e5cc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:22:05 crc kubenswrapper[4851]: I1001 13:22:05.739337 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db4056ce-42b4-4853-9e9f-69320e29e5cc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "db4056ce-42b4-4853-9e9f-69320e29e5cc" (UID: "db4056ce-42b4-4853-9e9f-69320e29e5cc"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:22:05 crc kubenswrapper[4851]: I1001 13:22:05.790062 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n492h\" (UniqueName: \"kubernetes.io/projected/db4056ce-42b4-4853-9e9f-69320e29e5cc-kube-api-access-n492h\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:05 crc kubenswrapper[4851]: I1001 13:22:05.790108 4851 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db4056ce-42b4-4853-9e9f-69320e29e5cc-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:05 crc kubenswrapper[4851]: I1001 13:22:05.790128 4851 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4056ce-42b4-4853-9e9f-69320e29e5cc-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:05 crc kubenswrapper[4851]: I1001 13:22:05.790148 4851 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db4056ce-42b4-4853-9e9f-69320e29e5cc-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.040176 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf" event={"ID":"db4056ce-42b4-4853-9e9f-69320e29e5cc","Type":"ContainerDied","Data":"04bb6ff22dc32dc9c717050e9186d58338ffc86fdf2d4689591166a0826de8ba"} Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.040492 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04bb6ff22dc32dc9c717050e9186d58338ffc86fdf2d4689591166a0826de8ba" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.040253 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.172473 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pp46c"] Oct 01 13:22:06 crc kubenswrapper[4851]: E1001 13:22:06.173056 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db4056ce-42b4-4853-9e9f-69320e29e5cc" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.173071 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="db4056ce-42b4-4853-9e9f-69320e29e5cc" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 13:22:06 crc kubenswrapper[4851]: E1001 13:22:06.173088 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a542b47-9b01-40aa-ab73-d6067689a6d7" containerName="registry-server" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.173094 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a542b47-9b01-40aa-ab73-d6067689a6d7" containerName="registry-server" Oct 01 13:22:06 crc kubenswrapper[4851]: E1001 13:22:06.173115 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a542b47-9b01-40aa-ab73-d6067689a6d7" containerName="extract-content" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.173121 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a542b47-9b01-40aa-ab73-d6067689a6d7" containerName="extract-content" Oct 01 13:22:06 crc kubenswrapper[4851]: E1001 13:22:06.173133 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a542b47-9b01-40aa-ab73-d6067689a6d7" containerName="extract-utilities" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.173139 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a542b47-9b01-40aa-ab73-d6067689a6d7" containerName="extract-utilities" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.173324 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a542b47-9b01-40aa-ab73-d6067689a6d7" containerName="registry-server" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.173339 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="db4056ce-42b4-4853-9e9f-69320e29e5cc" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.174070 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pp46c" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.177151 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.177465 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.177545 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2tz4d" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.178768 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.205400 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pp46c"] Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.307891 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pp46c\" (UID: \"6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pp46c" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.308090 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pp46c\" (UID: \"6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pp46c" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.308199 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlwvf\" (UniqueName: \"kubernetes.io/projected/6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b-kube-api-access-zlwvf\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pp46c\" (UID: \"6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pp46c" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.410356 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pp46c\" (UID: \"6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pp46c" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.410702 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pp46c\" (UID: \"6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pp46c" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.410835 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlwvf\" (UniqueName: \"kubernetes.io/projected/6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b-kube-api-access-zlwvf\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-pp46c\" (UID: \"6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pp46c" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.418155 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pp46c\" (UID: \"6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pp46c" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.431633 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pp46c\" (UID: \"6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pp46c" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.442575 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlwvf\" (UniqueName: \"kubernetes.io/projected/6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b-kube-api-access-zlwvf\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pp46c\" (UID: \"6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pp46c" Oct 01 13:22:06 crc kubenswrapper[4851]: I1001 13:22:06.507242 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pp46c" Oct 01 13:22:07 crc kubenswrapper[4851]: I1001 13:22:07.045970 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-lbkjz"] Oct 01 13:22:07 crc kubenswrapper[4851]: I1001 13:22:07.065780 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-xg8vg"] Oct 01 13:22:07 crc kubenswrapper[4851]: I1001 13:22:07.080214 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-lbkjz"] Oct 01 13:22:07 crc kubenswrapper[4851]: I1001 13:22:07.092664 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-xg8vg"] Oct 01 13:22:07 crc kubenswrapper[4851]: I1001 13:22:07.119205 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pp46c"] Oct 01 13:22:08 crc kubenswrapper[4851]: I1001 13:22:08.068381 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pp46c" event={"ID":"6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b","Type":"ContainerStarted","Data":"2b93659ec32ed9e9ce9fa20004013a3b4212fcd161f24b3c49b5b1ef7b8605b6"} Oct 01 13:22:08 crc kubenswrapper[4851]: I1001 13:22:08.068941 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pp46c" event={"ID":"6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b","Type":"ContainerStarted","Data":"60b9023a08c85fdfcedefaf0b6d1242ecefcd9386480d71f8a867aea5ebbc82e"} Oct 01 13:22:08 crc kubenswrapper[4851]: I1001 13:22:08.089120 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pp46c" podStartSLOduration=1.476804481 podStartE2EDuration="2.089098786s" podCreationTimestamp="2025-10-01 13:22:06 +0000 UTC" 
firstStartedPulling="2025-10-01 13:22:07.12565754 +0000 UTC m=+1735.470775026" lastFinishedPulling="2025-10-01 13:22:07.737951845 +0000 UTC m=+1736.083069331" observedRunningTime="2025-10-01 13:22:08.080737887 +0000 UTC m=+1736.425855393" watchObservedRunningTime="2025-10-01 13:22:08.089098786 +0000 UTC m=+1736.434216272" Oct 01 13:22:08 crc kubenswrapper[4851]: I1001 13:22:08.329394 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:22:08 crc kubenswrapper[4851]: E1001 13:22:08.329916 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:22:08 crc kubenswrapper[4851]: I1001 13:22:08.349564 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26c23b6f-4490-4676-ae47-dfdf989c1d4b" path="/var/lib/kubelet/pods/26c23b6f-4490-4676-ae47-dfdf989c1d4b/volumes" Oct 01 13:22:08 crc kubenswrapper[4851]: I1001 13:22:08.351069 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8779235-9b2e-48ec-a423-78de131a5da1" path="/var/lib/kubelet/pods/e8779235-9b2e-48ec-a423-78de131a5da1/volumes" Oct 01 13:22:12 crc kubenswrapper[4851]: I1001 13:22:12.048417 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-2dvwq"] Oct 01 13:22:12 crc kubenswrapper[4851]: I1001 13:22:12.060401 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-2dvwq"] Oct 01 13:22:12 crc kubenswrapper[4851]: I1001 13:22:12.350729 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8efa1df-96f3-4954-881a-bff1284f380b" path="/var/lib/kubelet/pods/a8efa1df-96f3-4954-881a-bff1284f380b/volumes" Oct 01 13:22:15 crc kubenswrapper[4851]: I1001 13:22:15.484218 4851 scope.go:117] "RemoveContainer" containerID="6ad311226be69c015b67fa6f59649d1455561a1038bad5aa7727e1d0a0157490" Oct 01 13:22:15 crc kubenswrapper[4851]: I1001 13:22:15.537828 4851 scope.go:117] "RemoveContainer" containerID="8b223680f1ef1c92742587cc02f36c5e1b66ce8fa77114d8927959d98d39143a" Oct 01 13:22:15 crc kubenswrapper[4851]: I1001 13:22:15.601375 4851 scope.go:117] "RemoveContainer" containerID="806a9baf4982bba0142b95d4e13a60fadb66cba456b3ea3252115f3cc6877ede" Oct 01 13:22:15 crc kubenswrapper[4851]: I1001 13:22:15.643310 4851 scope.go:117] "RemoveContainer" containerID="14d7041e98802f4e82c687617bf7cff5eaf16695c26dc2603dbc83300f24e6a0" Oct 01 13:22:15 crc kubenswrapper[4851]: I1001 13:22:15.710613 4851 scope.go:117] "RemoveContainer" containerID="bb6012633397d49cb42e2d31280d52b95ec2197e87f5deba1901222f8609e405" Oct 01 13:22:15 crc kubenswrapper[4851]: I1001 13:22:15.776483 4851 scope.go:117] "RemoveContainer" containerID="edb64889e6f0e00f3865e72a0370bb34e56ec00fabef5cbe0effb40b0261a687" Oct 01 13:22:15 crc kubenswrapper[4851]: I1001 13:22:15.807097 4851 scope.go:117] "RemoveContainer" containerID="486b3f2726dcbf09db1c1805cbe67352db45f6f7e4876ae95bd5ee9e936d63fc" Oct 01 13:22:15 crc kubenswrapper[4851]: I1001 13:22:15.839493 4851 scope.go:117] "RemoveContainer" containerID="7e57ccb5ac69cee39e1b211bb737f3c602e5cdf9e62fc6623cceb2819dfceee1" Oct 01 13:22:15 crc kubenswrapper[4851]: I1001 
13:22:15.864423 4851 scope.go:117] "RemoveContainer" containerID="0f567ca023d9a1094a3c2b1cea88bd2a96b1d3ae1079ddbe2b798507a43c05de" Oct 01 13:22:15 crc kubenswrapper[4851]: I1001 13:22:15.889271 4851 scope.go:117] "RemoveContainer" containerID="9f34893d354cd86fa202b84bf53091ae4b85961dfc88e0f14e5b5d191399abf3" Oct 01 13:22:15 crc kubenswrapper[4851]: I1001 13:22:15.917384 4851 scope.go:117] "RemoveContainer" containerID="33a47a2a8e2103d0d75691444679b66cb4cf7725a8330a5c4223416f51968e90" Oct 01 13:22:22 crc kubenswrapper[4851]: I1001 13:22:22.047377 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-7pnhj"] Oct 01 13:22:22 crc kubenswrapper[4851]: I1001 13:22:22.062316 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-aacf-account-create-dsc7w"] Oct 01 13:22:22 crc kubenswrapper[4851]: I1001 13:22:22.073103 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-6405-account-create-9stkn"] Oct 01 13:22:22 crc kubenswrapper[4851]: I1001 13:22:22.086746 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-7pnhj"] Oct 01 13:22:22 crc kubenswrapper[4851]: I1001 13:22:22.100103 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-aacf-account-create-dsc7w"] Oct 01 13:22:22 crc kubenswrapper[4851]: I1001 13:22:22.113588 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-6405-account-create-9stkn"] Oct 01 13:22:22 crc kubenswrapper[4851]: I1001 13:22:22.345337 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89d609db-765e-4137-8ed4-5364d359b402" path="/var/lib/kubelet/pods/89d609db-765e-4137-8ed4-5364d359b402/volumes" Oct 01 13:22:22 crc kubenswrapper[4851]: I1001 13:22:22.346562 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeff4acd-ae99-4646-a186-6bdfb431a52d" path="/var/lib/kubelet/pods/eeff4acd-ae99-4646-a186-6bdfb431a52d/volumes" Oct 01 13:22:22 crc kubenswrapper[4851]: I1001 13:22:22.347936 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f69aa5c0-2e72-4563-a8f8-5915b4bd0dbc" path="/var/lib/kubelet/pods/f69aa5c0-2e72-4563-a8f8-5915b4bd0dbc/volumes" Oct 01 13:22:23 crc kubenswrapper[4851]: I1001 13:22:23.328167 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:22:23 crc kubenswrapper[4851]: E1001 13:22:23.328903 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:22:24 crc kubenswrapper[4851]: I1001 13:22:24.034175 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-02e4-account-create-wtbvx"] Oct 01 13:22:24 crc kubenswrapper[4851]: I1001 13:22:24.048672 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-hgsvg"] Oct 01 13:22:24 crc kubenswrapper[4851]: I1001 13:22:24.058963 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-hgsvg"] Oct 01 13:22:24 crc kubenswrapper[4851]: I1001 13:22:24.068164 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-02e4-account-create-wtbvx"] 
Oct 01 13:22:24 crc kubenswrapper[4851]: I1001 13:22:24.341998 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="794e875d-440e-4a04-acf1-61b1e63c57d7" path="/var/lib/kubelet/pods/794e875d-440e-4a04-acf1-61b1e63c57d7/volumes" Oct 01 13:22:24 crc kubenswrapper[4851]: I1001 13:22:24.343351 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4" path="/var/lib/kubelet/pods/d1c1cc47-d35e-4a07-9972-0c7dd4c0d7a4/volumes" Oct 01 13:22:36 crc kubenswrapper[4851]: I1001 13:22:36.328717 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:22:36 crc kubenswrapper[4851]: E1001 13:22:36.329830 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:22:49 crc kubenswrapper[4851]: I1001 13:22:49.329129 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:22:49 crc kubenswrapper[4851]: E1001 13:22:49.330504 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:22:54 crc kubenswrapper[4851]: I1001 13:22:54.065914 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-f4s7j"] Oct 01 13:22:54 crc kubenswrapper[4851]: I1001 13:22:54.081162 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-f4s7j"] Oct 01 13:22:54 crc kubenswrapper[4851]: I1001 13:22:54.347269 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="384e4797-6339-4742-9430-e87739c80e74" path="/var/lib/kubelet/pods/384e4797-6339-4742-9430-e87739c80e74/volumes" Oct 01 13:23:00 crc kubenswrapper[4851]: I1001 13:23:00.330492 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:23:00 crc kubenswrapper[4851]: E1001 13:23:00.332625 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:23:12 crc kubenswrapper[4851]: I1001 13:23:12.342601 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:23:12 crc kubenswrapper[4851]: E1001 13:23:12.344000 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:23:16 crc kubenswrapper[4851]: I1001 13:23:16.242605 4851 scope.go:117] "RemoveContainer" containerID="a503b22708c0c59620b22b0206a1ea38fdd6791290d8c38aadebbcc38398a25b" Oct 01 13:23:16 crc kubenswrapper[4851]: I1001 13:23:16.281255 4851 scope.go:117] "RemoveContainer" containerID="6ca0ce32e81fb1f415203061c017037cb28ba1c7963705272fb9f329d843af73" Oct 01 13:23:16 crc kubenswrapper[4851]: I1001 13:23:16.355209 4851 scope.go:117] "RemoveContainer" containerID="025dbe9a776e103362c1718f705a3118f52d486ce84bd252907be0b21419bb46" Oct 01 13:23:16 crc kubenswrapper[4851]: I1001 13:23:16.387982 4851 scope.go:117] "RemoveContainer" containerID="60b5959899e23c4f3fcb65a880fc3c306659793dcf421d6cb46dc8f1329ad197" Oct 01 13:23:16 crc kubenswrapper[4851]: I1001 13:23:16.437477 4851 scope.go:117] "RemoveContainer" containerID="0896d580957f4080564c45aaf5cd9e3ed1e67e84ae1881f043ae0f1f24723a8e" Oct 01 13:23:16 crc kubenswrapper[4851]: I1001 13:23:16.481553 4851 scope.go:117] "RemoveContainer" containerID="cb3639292db6b2c3698abda9b74a5ce1a47634af99a24fa9daf627631532a5d7" Oct 01 13:23:20 crc kubenswrapper[4851]: I1001 13:23:20.055488 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-5s4gk"] Oct 01 13:23:20 crc kubenswrapper[4851]: I1001 13:23:20.067568 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-5s4gk"] Oct 01 13:23:20 crc kubenswrapper[4851]: I1001 13:23:20.343540 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9adae7cd-3c60-49d2-9048-138c3050d6f4" path="/var/lib/kubelet/pods/9adae7cd-3c60-49d2-9048-138c3050d6f4/volumes" Oct 01 13:23:23 crc kubenswrapper[4851]: I1001 13:23:23.044238 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-gk28q"] Oct 01 13:23:23 crc kubenswrapper[4851]: I1001 13:23:23.057876 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-gk28q"] Oct 01 13:23:24 crc kubenswrapper[4851]: I1001 13:23:24.344788 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1eb277b-5338-4d54-939f-6d636f34d14c" path="/var/lib/kubelet/pods/a1eb277b-5338-4d54-939f-6d636f34d14c/volumes" Oct 01 13:23:25 crc kubenswrapper[4851]: I1001 13:23:25.033007 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-l9xpp"] Oct 01 13:23:25 crc kubenswrapper[4851]: I1001 13:23:25.041244 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-l9xpp"] Oct 01 13:23:26 crc kubenswrapper[4851]: I1001 13:23:26.346631 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d64a19f8-04aa-4c82-9818-6a50d9e3d62b" path="/var/lib/kubelet/pods/d64a19f8-04aa-4c82-9818-6a50d9e3d62b/volumes" Oct 01 13:23:27 crc kubenswrapper[4851]: I1001 13:23:27.329002 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:23:27 crc kubenswrapper[4851]: E1001 13:23:27.329308 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:23:40 crc kubenswrapper[4851]: I1001 13:23:40.329083 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:23:40 crc kubenswrapper[4851]: E1001 13:23:40.329911 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:23:44 crc kubenswrapper[4851]: I1001 13:23:44.057902 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-9zwxh"] Oct 01 13:23:44 crc kubenswrapper[4851]: I1001 13:23:44.076364 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-9zwxh"] Oct 01 13:23:44 crc kubenswrapper[4851]: I1001 13:23:44.342036 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08f27829-7a4b-49d3-aed3-dbae56854228" path="/var/lib/kubelet/pods/08f27829-7a4b-49d3-aed3-dbae56854228/volumes" Oct 01 13:23:52 crc kubenswrapper[4851]: I1001 13:23:52.043868 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-fvbs7"] Oct 01 13:23:52 crc kubenswrapper[4851]: I1001 13:23:52.058987 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-fvbs7"] Oct 01 13:23:52 crc kubenswrapper[4851]: I1001 13:23:52.342147 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:23:52 crc kubenswrapper[4851]: E1001 13:23:52.343061 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:23:52 crc kubenswrapper[4851]: I1001 13:23:52.349255 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfc29451-e27b-4bd0-9a5f-6a177e7621be" path="/var/lib/kubelet/pods/bfc29451-e27b-4bd0-9a5f-6a177e7621be/volumes" Oct 01 13:23:55 crc kubenswrapper[4851]: I1001 13:23:55.394774 4851 generic.go:334] "Generic (PLEG): container finished" podID="6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b" containerID="2b93659ec32ed9e9ce9fa20004013a3b4212fcd161f24b3c49b5b1ef7b8605b6" exitCode=0 Oct 01 13:23:55 crc kubenswrapper[4851]: I1001 13:23:55.395001 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pp46c" event={"ID":"6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b","Type":"ContainerDied","Data":"2b93659ec32ed9e9ce9fa20004013a3b4212fcd161f24b3c49b5b1ef7b8605b6"} Oct 01 13:23:56 crc kubenswrapper[4851]: I1001 13:23:56.909737 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pp46c" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.076746 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b-inventory\") pod \"6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b\" (UID: \"6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b\") " Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.076881 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlwvf\" (UniqueName: \"kubernetes.io/projected/6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b-kube-api-access-zlwvf\") pod \"6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b\" (UID: \"6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b\") " Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.076941 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b-ssh-key\") pod \"6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b\" (UID: \"6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b\") " Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.084053 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b-kube-api-access-zlwvf" (OuterVolumeSpecName: "kube-api-access-zlwvf") pod "6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b" (UID: "6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b"). InnerVolumeSpecName "kube-api-access-zlwvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.126210 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b-inventory" (OuterVolumeSpecName: "inventory") pod "6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b" (UID: "6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.130786 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b" (UID: "6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.179885 4851 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.179931 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlwvf\" (UniqueName: \"kubernetes.io/projected/6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b-kube-api-access-zlwvf\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.179950 4851 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.418298 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pp46c" event={"ID":"6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b","Type":"ContainerDied","Data":"60b9023a08c85fdfcedefaf0b6d1242ecefcd9386480d71f8a867aea5ebbc82e"} Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.418340 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pp46c" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.418355 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60b9023a08c85fdfcedefaf0b6d1242ecefcd9386480d71f8a867aea5ebbc82e" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.526070 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw"] Oct 01 13:23:57 crc kubenswrapper[4851]: E1001 13:23:57.526675 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.526714 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.527109 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.528309 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.531120 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.531639 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.531649 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2tz4d" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.531790 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.536750 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw"] Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.690870 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e073c07-f76e-424b-a1d1-68fcabf7f063-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw\" (UID: \"4e073c07-f76e-424b-a1d1-68fcabf7f063\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.691002 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e073c07-f76e-424b-a1d1-68fcabf7f063-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw\" (UID: \"4e073c07-f76e-424b-a1d1-68fcabf7f063\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.691249 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xszbs\" (UniqueName: \"kubernetes.io/projected/4e073c07-f76e-424b-a1d1-68fcabf7f063-kube-api-access-xszbs\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw\" (UID: \"4e073c07-f76e-424b-a1d1-68fcabf7f063\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.793268 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e073c07-f76e-424b-a1d1-68fcabf7f063-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw\" (UID: \"4e073c07-f76e-424b-a1d1-68fcabf7f063\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.793431 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xszbs\" (UniqueName: \"kubernetes.io/projected/4e073c07-f76e-424b-a1d1-68fcabf7f063-kube-api-access-xszbs\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw\" (UID: \"4e073c07-f76e-424b-a1d1-68fcabf7f063\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.793709 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e073c07-f76e-424b-a1d1-68fcabf7f063-ssh-key\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw\" (UID: \"4e073c07-f76e-424b-a1d1-68fcabf7f063\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.799448 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e073c07-f76e-424b-a1d1-68fcabf7f063-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw\" (UID: \"4e073c07-f76e-424b-a1d1-68fcabf7f063\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.805132 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e073c07-f76e-424b-a1d1-68fcabf7f063-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw\" (UID: \"4e073c07-f76e-424b-a1d1-68fcabf7f063\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.831216 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xszbs\" (UniqueName: \"kubernetes.io/projected/4e073c07-f76e-424b-a1d1-68fcabf7f063-kube-api-access-xszbs\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw\" (UID: \"4e073c07-f76e-424b-a1d1-68fcabf7f063\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw" Oct 01 13:23:57 crc kubenswrapper[4851]: I1001 13:23:57.861757 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw" Oct 01 13:23:58 crc kubenswrapper[4851]: I1001 13:23:58.424573 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw"] Oct 01 13:23:58 crc kubenswrapper[4851]: I1001 13:23:58.445464 4851 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:23:59 crc kubenswrapper[4851]: I1001 13:23:59.447114 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw" event={"ID":"4e073c07-f76e-424b-a1d1-68fcabf7f063","Type":"ContainerStarted","Data":"8bed364f24c100631dadbc1308c55390300820cfd5eeacc30546318a1cb43f9f"} Oct 01 13:23:59 crc kubenswrapper[4851]: I1001 13:23:59.447594 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw" event={"ID":"4e073c07-f76e-424b-a1d1-68fcabf7f063","Type":"ContainerStarted","Data":"3688b80c89c5a86203dc125994165f933e2a8f651ac3ded35e28e93f0731dae6"} Oct 01 13:23:59 crc kubenswrapper[4851]: I1001 13:23:59.472693 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw" podStartSLOduration=2.000147512 podStartE2EDuration="2.472663243s" podCreationTimestamp="2025-10-01 13:23:57 +0000 UTC" firstStartedPulling="2025-10-01 13:23:58.445044557 +0000 UTC m=+1846.790162083" lastFinishedPulling="2025-10-01 13:23:58.917560328 +0000 UTC m=+1847.262677814" observedRunningTime="2025-10-01 13:23:59.466938559 +0000 UTC m=+1847.812056055" watchObservedRunningTime="2025-10-01 13:23:59.472663243 +0000 UTC m=+1847.817780769" Oct 01 13:24:07 crc kubenswrapper[4851]: I1001 13:24:07.329840 4851 scope.go:117] "RemoveContainer" 
containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:24:07 crc kubenswrapper[4851]: E1001 13:24:07.330976 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:24:14 crc kubenswrapper[4851]: I1001 13:24:14.071302 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-sqnxr"] Oct 01 13:24:14 crc kubenswrapper[4851]: I1001 13:24:14.086638 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-cchtl"] Oct 01 13:24:14 crc kubenswrapper[4851]: I1001 13:24:14.098824 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-sqnxr"] Oct 01 13:24:14 crc kubenswrapper[4851]: I1001 13:24:14.112282 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-cchtl"] Oct 01 13:24:14 crc kubenswrapper[4851]: I1001 13:24:14.349188 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eb25592-152d-4ad6-9fcc-46efaee35645" path="/var/lib/kubelet/pods/4eb25592-152d-4ad6-9fcc-46efaee35645/volumes" Oct 01 13:24:14 crc kubenswrapper[4851]: I1001 13:24:14.350271 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="936e4810-725f-42c8-95a2-4e307b5b7af5" path="/var/lib/kubelet/pods/936e4810-725f-42c8-95a2-4e307b5b7af5/volumes" Oct 01 13:24:15 crc kubenswrapper[4851]: I1001 13:24:15.042398 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-sd7q4"] Oct 01 13:24:15 crc kubenswrapper[4851]: I1001 13:24:15.059209 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-sd7q4"] Oct 01 13:24:16 crc kubenswrapper[4851]: I1001 13:24:16.347264 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d926f8f-83f5-470e-9dba-d2b3478d9dae" path="/var/lib/kubelet/pods/3d926f8f-83f5-470e-9dba-d2b3478d9dae/volumes" Oct 01 13:24:16 crc kubenswrapper[4851]: I1001 13:24:16.670773 4851 scope.go:117] "RemoveContainer" containerID="b4aa968fda4aaa434ad0dfbaced946667eea5f12c8a9302752650647eaa06653" Oct 01 13:24:16 crc kubenswrapper[4851]: I1001 13:24:16.701220 4851 scope.go:117] "RemoveContainer" containerID="60bf57a00bf28e2e0867c6e2e58142fcce14ec0f60100711fd8fd8ee0e933904" Oct 01 13:24:16 crc kubenswrapper[4851]: I1001 13:24:16.781600 4851 scope.go:117] "RemoveContainer" containerID="99b4ea3c0fcd4ea0a7809a3b999003e995bb59a8ca7fef21a8aaae20873e59b2" Oct 01 13:24:16 crc kubenswrapper[4851]: I1001 13:24:16.852523 4851 scope.go:117] "RemoveContainer" containerID="b34c5f5f25ad1f532fa333e66fa3f7f64c12594ef6c6d5ee4b481d76a2a5fbbf" Oct 01 13:24:16 crc kubenswrapper[4851]: I1001 13:24:16.918921 4851 scope.go:117] "RemoveContainer" containerID="8ba017052fe7382141520764bcb6eefb2e1e996457cd67073bd2717c8ca8b391" Oct 01 13:24:16 crc kubenswrapper[4851]: I1001 13:24:16.962475 4851 scope.go:117] "RemoveContainer" containerID="0ff455f8bbac0fc423335e4fe194bf52dc1d6f9849ca6aee36e83f57cceff6e1" Oct 01 13:24:17 crc kubenswrapper[4851]: I1001 13:24:17.003857 4851 scope.go:117] "RemoveContainer" containerID="138e17912a61a9728505be01bffd19e9f6001144eb35aa8790beea2649d9e8d0" Oct 01 
13:24:17 crc kubenswrapper[4851]: I1001 13:24:17.050920 4851 scope.go:117] "RemoveContainer" containerID="87f349687f0c6455251e99734637cc08652e820d4dd4e32b98d2c0cb03974969" Oct 01 13:24:22 crc kubenswrapper[4851]: I1001 13:24:22.341347 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:24:22 crc kubenswrapper[4851]: E1001 13:24:22.342399 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:24:24 crc kubenswrapper[4851]: I1001 13:24:24.258340 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9vxxj"] Oct 01 13:24:24 crc kubenswrapper[4851]: I1001 13:24:24.263279 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9vxxj" Oct 01 13:24:24 crc kubenswrapper[4851]: I1001 13:24:24.274261 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9vxxj"] Oct 01 13:24:24 crc kubenswrapper[4851]: I1001 13:24:24.401729 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e70d56b-a03d-4466-847d-7dad7c8bb05c-catalog-content\") pod \"community-operators-9vxxj\" (UID: \"1e70d56b-a03d-4466-847d-7dad7c8bb05c\") " pod="openshift-marketplace/community-operators-9vxxj" Oct 01 13:24:24 crc kubenswrapper[4851]: I1001 13:24:24.401936 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45szx\" (UniqueName: \"kubernetes.io/projected/1e70d56b-a03d-4466-847d-7dad7c8bb05c-kube-api-access-45szx\") pod \"community-operators-9vxxj\" (UID: \"1e70d56b-a03d-4466-847d-7dad7c8bb05c\") " pod="openshift-marketplace/community-operators-9vxxj" Oct 01 13:24:24 crc kubenswrapper[4851]: I1001 13:24:24.401981 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e70d56b-a03d-4466-847d-7dad7c8bb05c-utilities\") pod \"community-operators-9vxxj\" (UID: \"1e70d56b-a03d-4466-847d-7dad7c8bb05c\") " pod="openshift-marketplace/community-operators-9vxxj" Oct 01 13:24:24 crc kubenswrapper[4851]: I1001 13:24:24.503737 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45szx\" (UniqueName: \"kubernetes.io/projected/1e70d56b-a03d-4466-847d-7dad7c8bb05c-kube-api-access-45szx\") pod \"community-operators-9vxxj\" (UID: \"1e70d56b-a03d-4466-847d-7dad7c8bb05c\") " pod="openshift-marketplace/community-operators-9vxxj" Oct 01 13:24:24 crc kubenswrapper[4851]: I1001 13:24:24.503804 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e70d56b-a03d-4466-847d-7dad7c8bb05c-utilities\") pod \"community-operators-9vxxj\" (UID: \"1e70d56b-a03d-4466-847d-7dad7c8bb05c\") " pod="openshift-marketplace/community-operators-9vxxj" Oct 01 13:24:24 crc kubenswrapper[4851]: I1001 13:24:24.503953 4851 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e70d56b-a03d-4466-847d-7dad7c8bb05c-catalog-content\") pod \"community-operators-9vxxj\" (UID: \"1e70d56b-a03d-4466-847d-7dad7c8bb05c\") " pod="openshift-marketplace/community-operators-9vxxj" Oct 01 13:24:24 crc kubenswrapper[4851]: I1001 13:24:24.504617 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e70d56b-a03d-4466-847d-7dad7c8bb05c-utilities\") pod \"community-operators-9vxxj\" (UID: \"1e70d56b-a03d-4466-847d-7dad7c8bb05c\") " pod="openshift-marketplace/community-operators-9vxxj" Oct 01 13:24:24 crc kubenswrapper[4851]: I1001 13:24:24.504638 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e70d56b-a03d-4466-847d-7dad7c8bb05c-catalog-content\") pod \"community-operators-9vxxj\" (UID: \"1e70d56b-a03d-4466-847d-7dad7c8bb05c\") " pod="openshift-marketplace/community-operators-9vxxj" Oct 01 13:24:24 crc kubenswrapper[4851]: I1001 13:24:24.531277 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45szx\" (UniqueName: \"kubernetes.io/projected/1e70d56b-a03d-4466-847d-7dad7c8bb05c-kube-api-access-45szx\") pod \"community-operators-9vxxj\" (UID: \"1e70d56b-a03d-4466-847d-7dad7c8bb05c\") " pod="openshift-marketplace/community-operators-9vxxj" Oct 01 13:24:24 crc kubenswrapper[4851]: I1001 13:24:24.602146 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9vxxj" Oct 01 13:24:25 crc kubenswrapper[4851]: I1001 13:24:25.153186 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9vxxj"] Oct 01 13:24:25 crc kubenswrapper[4851]: W1001 13:24:25.160644 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e70d56b_a03d_4466_847d_7dad7c8bb05c.slice/crio-591b6b75f39ad8054e7a9b73cfdfb01b859dd11de4ba87dad9e51544f9b09a46 WatchSource:0}: Error finding container 591b6b75f39ad8054e7a9b73cfdfb01b859dd11de4ba87dad9e51544f9b09a46: Status 404 returned error can't find the container with id 591b6b75f39ad8054e7a9b73cfdfb01b859dd11de4ba87dad9e51544f9b09a46 Oct 01 13:24:25 crc kubenswrapper[4851]: I1001 13:24:25.736889 4851 generic.go:334] "Generic (PLEG): container finished" podID="1e70d56b-a03d-4466-847d-7dad7c8bb05c" containerID="1cbe3ae516c6c57ecda03ee0624f75f84eaaf6752f92c0e20d1101aa832c882e" exitCode=0 Oct 01 13:24:25 crc kubenswrapper[4851]: I1001 13:24:25.736952 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vxxj" event={"ID":"1e70d56b-a03d-4466-847d-7dad7c8bb05c","Type":"ContainerDied","Data":"1cbe3ae516c6c57ecda03ee0624f75f84eaaf6752f92c0e20d1101aa832c882e"} Oct 01 13:24:25 crc kubenswrapper[4851]: I1001 13:24:25.737270 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vxxj" event={"ID":"1e70d56b-a03d-4466-847d-7dad7c8bb05c","Type":"ContainerStarted","Data":"591b6b75f39ad8054e7a9b73cfdfb01b859dd11de4ba87dad9e51544f9b09a46"} Oct 01 13:24:26 crc kubenswrapper[4851]: I1001 13:24:26.754030 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vxxj" 
event={"ID":"1e70d56b-a03d-4466-847d-7dad7c8bb05c","Type":"ContainerStarted","Data":"4c63f4e8377cc57d3d121d3a435c6f599e455240341ae15ed0d0737679ef11af"} Oct 01 13:24:27 crc kubenswrapper[4851]: I1001 13:24:27.773450 4851 generic.go:334] "Generic (PLEG): container finished" podID="1e70d56b-a03d-4466-847d-7dad7c8bb05c" containerID="4c63f4e8377cc57d3d121d3a435c6f599e455240341ae15ed0d0737679ef11af" exitCode=0 Oct 01 13:24:27 crc kubenswrapper[4851]: I1001 13:24:27.773684 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vxxj" event={"ID":"1e70d56b-a03d-4466-847d-7dad7c8bb05c","Type":"ContainerDied","Data":"4c63f4e8377cc57d3d121d3a435c6f599e455240341ae15ed0d0737679ef11af"} Oct 01 13:24:28 crc kubenswrapper[4851]: I1001 13:24:28.787306 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vxxj" event={"ID":"1e70d56b-a03d-4466-847d-7dad7c8bb05c","Type":"ContainerStarted","Data":"9e8b33cc89a2c4688f95a9312dfda2fdffee2e599258bffe4db40cef0982939a"} Oct 01 13:24:28 crc kubenswrapper[4851]: I1001 13:24:28.816825 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9vxxj" podStartSLOduration=2.212499686 podStartE2EDuration="4.816796568s" podCreationTimestamp="2025-10-01 13:24:24 +0000 UTC" firstStartedPulling="2025-10-01 13:24:25.738647907 +0000 UTC m=+1874.083765433" lastFinishedPulling="2025-10-01 13:24:28.342944809 +0000 UTC m=+1876.688062315" observedRunningTime="2025-10-01 13:24:28.81195974 +0000 UTC m=+1877.157077226" watchObservedRunningTime="2025-10-01 13:24:28.816796568 +0000 UTC m=+1877.161914094" Oct 01 13:24:33 crc kubenswrapper[4851]: I1001 13:24:33.329095 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:24:33 crc kubenswrapper[4851]: I1001 13:24:33.846913 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"2cf857fcafcc27114560c9d5ed62e037c4c2e05f3ac9187b9ce4b4a9bc35966e"} Oct 01 13:24:34 crc kubenswrapper[4851]: I1001 13:24:34.036906 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-31e1-account-create-rczp4"] Oct 01 13:24:34 crc kubenswrapper[4851]: I1001 13:24:34.052735 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-b1ae-account-create-glmhj"] Oct 01 13:24:34 crc kubenswrapper[4851]: I1001 13:24:34.060147 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-4ab6-account-create-tfwql"] Oct 01 13:24:34 crc kubenswrapper[4851]: I1001 13:24:34.067487 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-b1ae-account-create-glmhj"] Oct 01 13:24:34 crc kubenswrapper[4851]: I1001 13:24:34.074411 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-4ab6-account-create-tfwql"] Oct 01 13:24:34 crc kubenswrapper[4851]: I1001 13:24:34.080463 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-31e1-account-create-rczp4"] Oct 01 13:24:34 crc kubenswrapper[4851]: I1001 13:24:34.341457 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="417dd621-4a2f-45a8-8060-26e9584b1916" path="/var/lib/kubelet/pods/417dd621-4a2f-45a8-8060-26e9584b1916/volumes" Oct 01 13:24:34 crc kubenswrapper[4851]: I1001 13:24:34.342358 4851 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44f78fae-38e4-48f5-9318-34c712a404d2" path="/var/lib/kubelet/pods/44f78fae-38e4-48f5-9318-34c712a404d2/volumes" Oct 01 13:24:34 crc kubenswrapper[4851]: I1001 13:24:34.343391 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d8656a-3cbc-4868-9e5e-4be315aa3a8d" path="/var/lib/kubelet/pods/68d8656a-3cbc-4868-9e5e-4be315aa3a8d/volumes" Oct 01 13:24:34 crc kubenswrapper[4851]: I1001 13:24:34.603364 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9vxxj" Oct 01 13:24:34 crc kubenswrapper[4851]: I1001 13:24:34.603971 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9vxxj" Oct 01 13:24:34 crc kubenswrapper[4851]: I1001 13:24:34.699043 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9vxxj" Oct 01 13:24:34 crc kubenswrapper[4851]: I1001 13:24:34.949986 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9vxxj" Oct 01 13:24:35 crc kubenswrapper[4851]: I1001 13:24:35.032076 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9vxxj"] Oct 01 13:24:36 crc kubenswrapper[4851]: I1001 13:24:36.885897 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9vxxj" podUID="1e70d56b-a03d-4466-847d-7dad7c8bb05c" containerName="registry-server" containerID="cri-o://9e8b33cc89a2c4688f95a9312dfda2fdffee2e599258bffe4db40cef0982939a" gracePeriod=2 Oct 01 13:24:37 crc kubenswrapper[4851]: I1001 13:24:37.431486 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9vxxj" Oct 01 13:24:37 crc kubenswrapper[4851]: I1001 13:24:37.561907 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e70d56b-a03d-4466-847d-7dad7c8bb05c-catalog-content\") pod \"1e70d56b-a03d-4466-847d-7dad7c8bb05c\" (UID: \"1e70d56b-a03d-4466-847d-7dad7c8bb05c\") " Oct 01 13:24:37 crc kubenswrapper[4851]: I1001 13:24:37.562485 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e70d56b-a03d-4466-847d-7dad7c8bb05c-utilities\") pod \"1e70d56b-a03d-4466-847d-7dad7c8bb05c\" (UID: \"1e70d56b-a03d-4466-847d-7dad7c8bb05c\") " Oct 01 13:24:37 crc kubenswrapper[4851]: I1001 13:24:37.562691 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45szx\" (UniqueName: \"kubernetes.io/projected/1e70d56b-a03d-4466-847d-7dad7c8bb05c-kube-api-access-45szx\") pod \"1e70d56b-a03d-4466-847d-7dad7c8bb05c\" (UID: \"1e70d56b-a03d-4466-847d-7dad7c8bb05c\") " Oct 01 13:24:37 crc kubenswrapper[4851]: I1001 13:24:37.563371 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e70d56b-a03d-4466-847d-7dad7c8bb05c-utilities" (OuterVolumeSpecName: "utilities") pod "1e70d56b-a03d-4466-847d-7dad7c8bb05c" (UID: "1e70d56b-a03d-4466-847d-7dad7c8bb05c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:24:37 crc kubenswrapper[4851]: I1001 13:24:37.563764 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e70d56b-a03d-4466-847d-7dad7c8bb05c-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:37 crc kubenswrapper[4851]: I1001 13:24:37.568709 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e70d56b-a03d-4466-847d-7dad7c8bb05c-kube-api-access-45szx" (OuterVolumeSpecName: "kube-api-access-45szx") pod "1e70d56b-a03d-4466-847d-7dad7c8bb05c" (UID: "1e70d56b-a03d-4466-847d-7dad7c8bb05c"). InnerVolumeSpecName "kube-api-access-45szx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:24:37 crc kubenswrapper[4851]: I1001 13:24:37.666599 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45szx\" (UniqueName: \"kubernetes.io/projected/1e70d56b-a03d-4466-847d-7dad7c8bb05c-kube-api-access-45szx\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:37 crc kubenswrapper[4851]: I1001 13:24:37.899740 4851 generic.go:334] "Generic (PLEG): container finished" podID="1e70d56b-a03d-4466-847d-7dad7c8bb05c" containerID="9e8b33cc89a2c4688f95a9312dfda2fdffee2e599258bffe4db40cef0982939a" exitCode=0 Oct 01 13:24:37 crc kubenswrapper[4851]: I1001 13:24:37.899789 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vxxj" event={"ID":"1e70d56b-a03d-4466-847d-7dad7c8bb05c","Type":"ContainerDied","Data":"9e8b33cc89a2c4688f95a9312dfda2fdffee2e599258bffe4db40cef0982939a"} Oct 01 13:24:37 crc kubenswrapper[4851]: I1001 13:24:37.899837 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9vxxj" Oct 01 13:24:37 crc kubenswrapper[4851]: I1001 13:24:37.899854 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vxxj" event={"ID":"1e70d56b-a03d-4466-847d-7dad7c8bb05c","Type":"ContainerDied","Data":"591b6b75f39ad8054e7a9b73cfdfb01b859dd11de4ba87dad9e51544f9b09a46"} Oct 01 13:24:37 crc kubenswrapper[4851]: I1001 13:24:37.899883 4851 scope.go:117] "RemoveContainer" containerID="9e8b33cc89a2c4688f95a9312dfda2fdffee2e599258bffe4db40cef0982939a" Oct 01 13:24:37 crc kubenswrapper[4851]: I1001 13:24:37.931680 4851 scope.go:117] "RemoveContainer" containerID="4c63f4e8377cc57d3d121d3a435c6f599e455240341ae15ed0d0737679ef11af" Oct 01 13:24:37 crc kubenswrapper[4851]: I1001 13:24:37.972573 4851 scope.go:117] "RemoveContainer" containerID="1cbe3ae516c6c57ecda03ee0624f75f84eaaf6752f92c0e20d1101aa832c882e" Oct 01 13:24:38 crc kubenswrapper[4851]: I1001 13:24:38.044781 4851 scope.go:117] "RemoveContainer" containerID="9e8b33cc89a2c4688f95a9312dfda2fdffee2e599258bffe4db40cef0982939a" Oct 01 13:24:38 crc kubenswrapper[4851]: E1001 13:24:38.045463 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e8b33cc89a2c4688f95a9312dfda2fdffee2e599258bffe4db40cef0982939a\": container with ID starting with 9e8b33cc89a2c4688f95a9312dfda2fdffee2e599258bffe4db40cef0982939a not found: ID does not exist" containerID="9e8b33cc89a2c4688f95a9312dfda2fdffee2e599258bffe4db40cef0982939a" Oct 01 13:24:38 crc kubenswrapper[4851]: I1001 13:24:38.045529 4851 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9e8b33cc89a2c4688f95a9312dfda2fdffee2e599258bffe4db40cef0982939a"} err="failed to get container status \"9e8b33cc89a2c4688f95a9312dfda2fdffee2e599258bffe4db40cef0982939a\": rpc error: code = NotFound desc = could not find container \"9e8b33cc89a2c4688f95a9312dfda2fdffee2e599258bffe4db40cef0982939a\": container with ID starting with 9e8b33cc89a2c4688f95a9312dfda2fdffee2e599258bffe4db40cef0982939a not found: ID does not exist" Oct 01 13:24:38 crc kubenswrapper[4851]: I1001 13:24:38.045562 4851 scope.go:117] "RemoveContainer" containerID="4c63f4e8377cc57d3d121d3a435c6f599e455240341ae15ed0d0737679ef11af" Oct 01 13:24:38 crc kubenswrapper[4851]: E1001 13:24:38.046059 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c63f4e8377cc57d3d121d3a435c6f599e455240341ae15ed0d0737679ef11af\": container with ID starting with 4c63f4e8377cc57d3d121d3a435c6f599e455240341ae15ed0d0737679ef11af not found: ID does not exist" containerID="4c63f4e8377cc57d3d121d3a435c6f599e455240341ae15ed0d0737679ef11af" Oct 01 13:24:38 crc kubenswrapper[4851]: I1001 13:24:38.046095 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c63f4e8377cc57d3d121d3a435c6f599e455240341ae15ed0d0737679ef11af"} err="failed to get container status \"4c63f4e8377cc57d3d121d3a435c6f599e455240341ae15ed0d0737679ef11af\": rpc error: code = NotFound desc = could not find container \"4c63f4e8377cc57d3d121d3a435c6f599e455240341ae15ed0d0737679ef11af\": container with ID starting with 4c63f4e8377cc57d3d121d3a435c6f599e455240341ae15ed0d0737679ef11af not found: ID does not exist" Oct 01 13:24:38 crc kubenswrapper[4851]: I1001 13:24:38.046119 4851 scope.go:117] "RemoveContainer" containerID="1cbe3ae516c6c57ecda03ee0624f75f84eaaf6752f92c0e20d1101aa832c882e" Oct 01 13:24:38 crc kubenswrapper[4851]: E1001 13:24:38.046538 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cbe3ae516c6c57ecda03ee0624f75f84eaaf6752f92c0e20d1101aa832c882e\": container with ID starting with 1cbe3ae516c6c57ecda03ee0624f75f84eaaf6752f92c0e20d1101aa832c882e not found: ID does not exist" containerID="1cbe3ae516c6c57ecda03ee0624f75f84eaaf6752f92c0e20d1101aa832c882e" Oct 01 13:24:38 crc kubenswrapper[4851]: I1001 13:24:38.046561 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cbe3ae516c6c57ecda03ee0624f75f84eaaf6752f92c0e20d1101aa832c882e"} err="failed to get container status \"1cbe3ae516c6c57ecda03ee0624f75f84eaaf6752f92c0e20d1101aa832c882e\": rpc error: code = NotFound desc = could not find container \"1cbe3ae516c6c57ecda03ee0624f75f84eaaf6752f92c0e20d1101aa832c882e\": container with ID starting with 1cbe3ae516c6c57ecda03ee0624f75f84eaaf6752f92c0e20d1101aa832c882e not found: ID does not exist" Oct 01 13:24:38 crc kubenswrapper[4851]: I1001 13:24:38.436414 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e70d56b-a03d-4466-847d-7dad7c8bb05c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e70d56b-a03d-4466-847d-7dad7c8bb05c" (UID: "1e70d56b-a03d-4466-847d-7dad7c8bb05c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:24:38 crc kubenswrapper[4851]: I1001 13:24:38.483232 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e70d56b-a03d-4466-847d-7dad7c8bb05c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:38 crc kubenswrapper[4851]: I1001 13:24:38.559609 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9vxxj"] Oct 01 13:24:38 crc kubenswrapper[4851]: I1001 13:24:38.569647 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9vxxj"] Oct 01 13:24:40 crc kubenswrapper[4851]: I1001 13:24:40.346166 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e70d56b-a03d-4466-847d-7dad7c8bb05c" path="/var/lib/kubelet/pods/1e70d56b-a03d-4466-847d-7dad7c8bb05c/volumes" Oct 01 13:25:03 crc kubenswrapper[4851]: I1001 13:25:03.042056 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dx6g6"] Oct 01 13:25:03 crc kubenswrapper[4851]: I1001 13:25:03.051314 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dx6g6"] Oct 01 13:25:04 crc kubenswrapper[4851]: I1001 13:25:04.338153 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a3816a8-ed2c-4f4c-a4ba-485b9182a68e" path="/var/lib/kubelet/pods/1a3816a8-ed2c-4f4c-a4ba-485b9182a68e/volumes" Oct 01 13:25:17 crc kubenswrapper[4851]: I1001 13:25:17.306333 4851 scope.go:117] "RemoveContainer" containerID="77c9cfdf83b5158040fe6acf29bdbff56a92e63f3eed5e4825987259fd59c0a7" Oct 01 13:25:17 crc kubenswrapper[4851]: I1001 13:25:17.350775 4851 scope.go:117] "RemoveContainer" containerID="69f651f01f7708fea9cf46f422c6823c2ccd1b53dd4dbf05a8778f2235def0c3" Oct 01 13:25:17 crc kubenswrapper[4851]: I1001 13:25:17.409844 4851 scope.go:117] "RemoveContainer" containerID="5cccac4181ff635360aa92dde6191f39997a1d7e3be9fff27bb1844e58cddce5" Oct 01 13:25:17 crc kubenswrapper[4851]: I1001 13:25:17.458561 4851 scope.go:117] "RemoveContainer" containerID="dd1647f0669f311c2c2126b67d0997ad5e6912fd8eaf2618076fcd267c28eabb" Oct 01 13:25:25 crc kubenswrapper[4851]: I1001 13:25:25.479533 4851 generic.go:334] "Generic (PLEG): container finished" podID="4e073c07-f76e-424b-a1d1-68fcabf7f063" containerID="8bed364f24c100631dadbc1308c55390300820cfd5eeacc30546318a1cb43f9f" exitCode=0 Oct 01 13:25:25 crc kubenswrapper[4851]: I1001 13:25:25.479638 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw" event={"ID":"4e073c07-f76e-424b-a1d1-68fcabf7f063","Type":"ContainerDied","Data":"8bed364f24c100631dadbc1308c55390300820cfd5eeacc30546318a1cb43f9f"} Oct 01 13:25:26 crc kubenswrapper[4851]: I1001 13:25:26.046162 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-7shqh"] Oct 01 13:25:26 crc kubenswrapper[4851]: I1001 13:25:26.059042 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-7shqh"] Oct 01 13:25:26 crc kubenswrapper[4851]: I1001 13:25:26.349870 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed619e77-8ae0-4eaf-8c68-611b8883f603" path="/var/lib/kubelet/pods/ed619e77-8ae0-4eaf-8c68-611b8883f603/volumes" Oct 01 13:25:26 crc kubenswrapper[4851]: I1001 13:25:26.978392 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.153078 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e073c07-f76e-424b-a1d1-68fcabf7f063-ssh-key\") pod \"4e073c07-f76e-424b-a1d1-68fcabf7f063\" (UID: \"4e073c07-f76e-424b-a1d1-68fcabf7f063\") " Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.153672 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xszbs\" (UniqueName: \"kubernetes.io/projected/4e073c07-f76e-424b-a1d1-68fcabf7f063-kube-api-access-xszbs\") pod \"4e073c07-f76e-424b-a1d1-68fcabf7f063\" (UID: \"4e073c07-f76e-424b-a1d1-68fcabf7f063\") " Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.154072 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e073c07-f76e-424b-a1d1-68fcabf7f063-inventory\") pod \"4e073c07-f76e-424b-a1d1-68fcabf7f063\" (UID: \"4e073c07-f76e-424b-a1d1-68fcabf7f063\") " Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.160128 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e073c07-f76e-424b-a1d1-68fcabf7f063-kube-api-access-xszbs" (OuterVolumeSpecName: "kube-api-access-xszbs") pod "4e073c07-f76e-424b-a1d1-68fcabf7f063" (UID: "4e073c07-f76e-424b-a1d1-68fcabf7f063"). InnerVolumeSpecName "kube-api-access-xszbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.214983 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e073c07-f76e-424b-a1d1-68fcabf7f063-inventory" (OuterVolumeSpecName: "inventory") pod "4e073c07-f76e-424b-a1d1-68fcabf7f063" (UID: "4e073c07-f76e-424b-a1d1-68fcabf7f063"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.215012 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e073c07-f76e-424b-a1d1-68fcabf7f063-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4e073c07-f76e-424b-a1d1-68fcabf7f063" (UID: "4e073c07-f76e-424b-a1d1-68fcabf7f063"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.256114 4851 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e073c07-f76e-424b-a1d1-68fcabf7f063-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.256147 4851 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e073c07-f76e-424b-a1d1-68fcabf7f063-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.256157 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xszbs\" (UniqueName: \"kubernetes.io/projected/4e073c07-f76e-424b-a1d1-68fcabf7f063-kube-api-access-xszbs\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.500916 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw" event={"ID":"4e073c07-f76e-424b-a1d1-68fcabf7f063","Type":"ContainerDied","Data":"3688b80c89c5a86203dc125994165f933e2a8f651ac3ded35e28e93f0731dae6"} Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.501239 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3688b80c89c5a86203dc125994165f933e2a8f651ac3ded35e28e93f0731dae6" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.500989 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.643814 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6"] Oct 01 13:25:27 crc kubenswrapper[4851]: E1001 13:25:27.644238 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e073c07-f76e-424b-a1d1-68fcabf7f063" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.644260 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e073c07-f76e-424b-a1d1-68fcabf7f063" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:25:27 crc kubenswrapper[4851]: E1001 13:25:27.644277 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e70d56b-a03d-4466-847d-7dad7c8bb05c" containerName="extract-utilities" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.644285 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e70d56b-a03d-4466-847d-7dad7c8bb05c" containerName="extract-utilities" Oct 01 13:25:27 crc kubenswrapper[4851]: E1001 13:25:27.644306 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e70d56b-a03d-4466-847d-7dad7c8bb05c" containerName="extract-content" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.644315 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e70d56b-a03d-4466-847d-7dad7c8bb05c" containerName="extract-content" Oct 01 13:25:27 crc kubenswrapper[4851]: E1001 13:25:27.644329 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e70d56b-a03d-4466-847d-7dad7c8bb05c" containerName="registry-server" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.644336 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e70d56b-a03d-4466-847d-7dad7c8bb05c" containerName="registry-server" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.644533 4851 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1e70d56b-a03d-4466-847d-7dad7c8bb05c" containerName="registry-server" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.644546 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e073c07-f76e-424b-a1d1-68fcabf7f063" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.645215 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.648584 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.648785 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.649203 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.648743 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2tz4d" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.654530 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6"] Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.771832 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfdnv\" (UniqueName: \"kubernetes.io/projected/8457e33a-c243-4a18-80f6-8a1777d60054-kube-api-access-dfdnv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6\" (UID: \"8457e33a-c243-4a18-80f6-8a1777d60054\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.771944 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8457e33a-c243-4a18-80f6-8a1777d60054-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6\" (UID: \"8457e33a-c243-4a18-80f6-8a1777d60054\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.771965 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8457e33a-c243-4a18-80f6-8a1777d60054-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6\" (UID: \"8457e33a-c243-4a18-80f6-8a1777d60054\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.873468 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8457e33a-c243-4a18-80f6-8a1777d60054-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6\" (UID: \"8457e33a-c243-4a18-80f6-8a1777d60054\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.873515 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/8457e33a-c243-4a18-80f6-8a1777d60054-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6\" (UID: \"8457e33a-c243-4a18-80f6-8a1777d60054\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.873609 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfdnv\" (UniqueName: \"kubernetes.io/projected/8457e33a-c243-4a18-80f6-8a1777d60054-kube-api-access-dfdnv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6\" (UID: \"8457e33a-c243-4a18-80f6-8a1777d60054\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.877362 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8457e33a-c243-4a18-80f6-8a1777d60054-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6\" (UID: \"8457e33a-c243-4a18-80f6-8a1777d60054\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.878055 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8457e33a-c243-4a18-80f6-8a1777d60054-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6\" (UID: \"8457e33a-c243-4a18-80f6-8a1777d60054\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.892593 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfdnv\" (UniqueName: \"kubernetes.io/projected/8457e33a-c243-4a18-80f6-8a1777d60054-kube-api-access-dfdnv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6\" (UID: \"8457e33a-c243-4a18-80f6-8a1777d60054\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6" Oct 01 13:25:27 crc kubenswrapper[4851]: I1001 13:25:27.965697 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6" Oct 01 13:25:28 crc kubenswrapper[4851]: I1001 13:25:28.459224 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6"] Oct 01 13:25:28 crc kubenswrapper[4851]: I1001 13:25:28.513835 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6" event={"ID":"8457e33a-c243-4a18-80f6-8a1777d60054","Type":"ContainerStarted","Data":"bde571c713846d100fff879efad38922cf7f9b08d10b2a7317e8ebb01e87ccac"} Oct 01 13:25:29 crc kubenswrapper[4851]: I1001 13:25:29.547885 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6" event={"ID":"8457e33a-c243-4a18-80f6-8a1777d60054","Type":"ContainerStarted","Data":"7a50b7ab4520fe83588582c548bb96417bc6687b68a229f6d64c5be34aafb590"} Oct 01 13:25:29 crc kubenswrapper[4851]: I1001 13:25:29.566735 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6" podStartSLOduration=2.094436045 podStartE2EDuration="2.566707362s" podCreationTimestamp="2025-10-01 13:25:27 +0000 UTC" firstStartedPulling="2025-10-01 13:25:28.464999116 +0000 UTC m=+1936.810116602" lastFinishedPulling="2025-10-01 13:25:28.937270433 +0000 UTC m=+1937.282387919" observedRunningTime="2025-10-01 13:25:29.566092274 +0000 UTC m=+1937.911209800" watchObservedRunningTime="2025-10-01 13:25:29.566707362 +0000 UTC m=+1937.911824878" Oct 01 13:25:33 crc kubenswrapper[4851]: I1001 13:25:33.038772 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-q4cn9"] Oct 01 13:25:33 crc kubenswrapper[4851]: I1001 13:25:33.051084 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-q4cn9"] Oct 01 13:25:34 crc kubenswrapper[4851]: I1001 13:25:34.338644 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b" path="/var/lib/kubelet/pods/8a092c6d-ccb7-4c6f-b159-d7d6f6b53b7b/volumes" Oct 01 13:25:34 crc kubenswrapper[4851]: I1001 13:25:34.596089 4851 generic.go:334] "Generic (PLEG): container finished" podID="8457e33a-c243-4a18-80f6-8a1777d60054" containerID="7a50b7ab4520fe83588582c548bb96417bc6687b68a229f6d64c5be34aafb590" exitCode=0 Oct 01 13:25:34 crc kubenswrapper[4851]: I1001 13:25:34.596139 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6" event={"ID":"8457e33a-c243-4a18-80f6-8a1777d60054","Type":"ContainerDied","Data":"7a50b7ab4520fe83588582c548bb96417bc6687b68a229f6d64c5be34aafb590"} Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.021264 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.151462 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfdnv\" (UniqueName: \"kubernetes.io/projected/8457e33a-c243-4a18-80f6-8a1777d60054-kube-api-access-dfdnv\") pod \"8457e33a-c243-4a18-80f6-8a1777d60054\" (UID: \"8457e33a-c243-4a18-80f6-8a1777d60054\") " Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.151922 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8457e33a-c243-4a18-80f6-8a1777d60054-ssh-key\") pod \"8457e33a-c243-4a18-80f6-8a1777d60054\" (UID: \"8457e33a-c243-4a18-80f6-8a1777d60054\") " Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.151996 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8457e33a-c243-4a18-80f6-8a1777d60054-inventory\") pod \"8457e33a-c243-4a18-80f6-8a1777d60054\" (UID: \"8457e33a-c243-4a18-80f6-8a1777d60054\") " Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.159768 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8457e33a-c243-4a18-80f6-8a1777d60054-kube-api-access-dfdnv" (OuterVolumeSpecName: "kube-api-access-dfdnv") pod "8457e33a-c243-4a18-80f6-8a1777d60054" (UID: "8457e33a-c243-4a18-80f6-8a1777d60054"). InnerVolumeSpecName "kube-api-access-dfdnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.180846 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8457e33a-c243-4a18-80f6-8a1777d60054-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8457e33a-c243-4a18-80f6-8a1777d60054" (UID: "8457e33a-c243-4a18-80f6-8a1777d60054"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.203820 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8457e33a-c243-4a18-80f6-8a1777d60054-inventory" (OuterVolumeSpecName: "inventory") pod "8457e33a-c243-4a18-80f6-8a1777d60054" (UID: "8457e33a-c243-4a18-80f6-8a1777d60054"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.255635 4851 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8457e33a-c243-4a18-80f6-8a1777d60054-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.255668 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfdnv\" (UniqueName: \"kubernetes.io/projected/8457e33a-c243-4a18-80f6-8a1777d60054-kube-api-access-dfdnv\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.255682 4851 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8457e33a-c243-4a18-80f6-8a1777d60054-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.614438 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6" event={"ID":"8457e33a-c243-4a18-80f6-8a1777d60054","Type":"ContainerDied","Data":"bde571c713846d100fff879efad38922cf7f9b08d10b2a7317e8ebb01e87ccac"} Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.614478 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bde571c713846d100fff879efad38922cf7f9b08d10b2a7317e8ebb01e87ccac" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.614557 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.715604 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-swrg6"] Oct 01 13:25:36 crc kubenswrapper[4851]: E1001 13:25:36.716014 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8457e33a-c243-4a18-80f6-8a1777d60054" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.716031 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="8457e33a-c243-4a18-80f6-8a1777d60054" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.716246 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="8457e33a-c243-4a18-80f6-8a1777d60054" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.717512 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-swrg6" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.721045 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.721109 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2tz4d" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.721289 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.739863 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.745248 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-swrg6"] Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.765642 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b850e5c8-7c3d-4ae2-ab48-a17e80c41091-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-swrg6\" (UID: \"b850e5c8-7c3d-4ae2-ab48-a17e80c41091\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-swrg6" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.765705 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b850e5c8-7c3d-4ae2-ab48-a17e80c41091-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-swrg6\" (UID: \"b850e5c8-7c3d-4ae2-ab48-a17e80c41091\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-swrg6" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.765781 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvtk6\" (UniqueName: \"kubernetes.io/projected/b850e5c8-7c3d-4ae2-ab48-a17e80c41091-kube-api-access-rvtk6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-swrg6\" (UID: \"b850e5c8-7c3d-4ae2-ab48-a17e80c41091\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-swrg6" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.867968 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b850e5c8-7c3d-4ae2-ab48-a17e80c41091-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-swrg6\" (UID: \"b850e5c8-7c3d-4ae2-ab48-a17e80c41091\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-swrg6" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.868056 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b850e5c8-7c3d-4ae2-ab48-a17e80c41091-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-swrg6\" (UID: \"b850e5c8-7c3d-4ae2-ab48-a17e80c41091\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-swrg6" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.868129 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvtk6\" (UniqueName: \"kubernetes.io/projected/b850e5c8-7c3d-4ae2-ab48-a17e80c41091-kube-api-access-rvtk6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-swrg6\" (UID: 
\"b850e5c8-7c3d-4ae2-ab48-a17e80c41091\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-swrg6" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.872285 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b850e5c8-7c3d-4ae2-ab48-a17e80c41091-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-swrg6\" (UID: \"b850e5c8-7c3d-4ae2-ab48-a17e80c41091\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-swrg6" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.872422 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b850e5c8-7c3d-4ae2-ab48-a17e80c41091-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-swrg6\" (UID: \"b850e5c8-7c3d-4ae2-ab48-a17e80c41091\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-swrg6" Oct 01 13:25:36 crc kubenswrapper[4851]: I1001 13:25:36.886644 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvtk6\" (UniqueName: \"kubernetes.io/projected/b850e5c8-7c3d-4ae2-ab48-a17e80c41091-kube-api-access-rvtk6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-swrg6\" (UID: \"b850e5c8-7c3d-4ae2-ab48-a17e80c41091\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-swrg6" Oct 01 13:25:37 crc kubenswrapper[4851]: I1001 13:25:37.052973 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-swrg6" Oct 01 13:25:37 crc kubenswrapper[4851]: I1001 13:25:37.629960 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-swrg6"] Oct 01 13:25:38 crc kubenswrapper[4851]: I1001 13:25:38.648878 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-swrg6" event={"ID":"b850e5c8-7c3d-4ae2-ab48-a17e80c41091","Type":"ContainerStarted","Data":"f9d98293b05c64157da2b64213e5e717005e2e9e7a796e894af1137bb35c35bc"} Oct 01 13:25:38 crc kubenswrapper[4851]: I1001 13:25:38.649267 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-swrg6" event={"ID":"b850e5c8-7c3d-4ae2-ab48-a17e80c41091","Type":"ContainerStarted","Data":"1e16ac197d0835b9498ffcc6a1d671105efc8c1acbd4abf78acdeb04087395d6"} Oct 01 13:26:10 crc kubenswrapper[4851]: I1001 13:26:10.048781 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-swrg6" podStartSLOduration=33.532731511 podStartE2EDuration="34.048761025s" podCreationTimestamp="2025-10-01 13:25:36 +0000 UTC" firstStartedPulling="2025-10-01 13:25:37.64397584 +0000 UTC m=+1945.989093316" lastFinishedPulling="2025-10-01 13:25:38.160005334 +0000 UTC m=+1946.505122830" observedRunningTime="2025-10-01 13:25:38.664690535 +0000 UTC m=+1947.009808041" watchObservedRunningTime="2025-10-01 13:26:10.048761025 +0000 UTC m=+1978.393878511" Oct 01 13:26:10 crc kubenswrapper[4851]: I1001 13:26:10.052339 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-krnh6"] Oct 01 13:26:10 crc kubenswrapper[4851]: I1001 13:26:10.060555 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-krnh6"] Oct 01 13:26:10 crc kubenswrapper[4851]: I1001 13:26:10.351277 4851 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="4e7c9248-7155-4f8b-a070-3374a93bc56b" path="/var/lib/kubelet/pods/4e7c9248-7155-4f8b-a070-3374a93bc56b/volumes" Oct 01 13:26:17 crc kubenswrapper[4851]: I1001 13:26:17.599421 4851 scope.go:117] "RemoveContainer" containerID="6a05f3b09fa8ebef756d7566475da11afa25007ecf5b1addb8f181475dbd6894" Oct 01 13:26:17 crc kubenswrapper[4851]: I1001 13:26:17.662020 4851 scope.go:117] "RemoveContainer" containerID="48567905938e95dcebd6cf2bd21f5a663d779580c6cf0f0e7c1b928816e4a449" Oct 01 13:26:17 crc kubenswrapper[4851]: I1001 13:26:17.730289 4851 scope.go:117] "RemoveContainer" containerID="37fdd75b45485c986f60c17b2c07beddfef2a71d34a715c47676888c61459a4f" Oct 01 13:26:28 crc kubenswrapper[4851]: I1001 13:26:28.219447 4851 generic.go:334] "Generic (PLEG): container finished" podID="b850e5c8-7c3d-4ae2-ab48-a17e80c41091" containerID="f9d98293b05c64157da2b64213e5e717005e2e9e7a796e894af1137bb35c35bc" exitCode=0 Oct 01 13:26:28 crc kubenswrapper[4851]: I1001 13:26:28.219541 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-swrg6" event={"ID":"b850e5c8-7c3d-4ae2-ab48-a17e80c41091","Type":"ContainerDied","Data":"f9d98293b05c64157da2b64213e5e717005e2e9e7a796e894af1137bb35c35bc"} Oct 01 13:26:29 crc kubenswrapper[4851]: I1001 13:26:29.701058 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-swrg6" Oct 01 13:26:29 crc kubenswrapper[4851]: I1001 13:26:29.812726 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b850e5c8-7c3d-4ae2-ab48-a17e80c41091-inventory\") pod \"b850e5c8-7c3d-4ae2-ab48-a17e80c41091\" (UID: \"b850e5c8-7c3d-4ae2-ab48-a17e80c41091\") " Oct 01 13:26:29 crc kubenswrapper[4851]: I1001 13:26:29.812979 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvtk6\" (UniqueName: \"kubernetes.io/projected/b850e5c8-7c3d-4ae2-ab48-a17e80c41091-kube-api-access-rvtk6\") pod \"b850e5c8-7c3d-4ae2-ab48-a17e80c41091\" (UID: \"b850e5c8-7c3d-4ae2-ab48-a17e80c41091\") " Oct 01 13:26:29 crc kubenswrapper[4851]: I1001 13:26:29.813065 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b850e5c8-7c3d-4ae2-ab48-a17e80c41091-ssh-key\") pod \"b850e5c8-7c3d-4ae2-ab48-a17e80c41091\" (UID: \"b850e5c8-7c3d-4ae2-ab48-a17e80c41091\") " Oct 01 13:26:29 crc kubenswrapper[4851]: I1001 13:26:29.819745 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b850e5c8-7c3d-4ae2-ab48-a17e80c41091-kube-api-access-rvtk6" (OuterVolumeSpecName: "kube-api-access-rvtk6") pod "b850e5c8-7c3d-4ae2-ab48-a17e80c41091" (UID: "b850e5c8-7c3d-4ae2-ab48-a17e80c41091"). InnerVolumeSpecName "kube-api-access-rvtk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:26:29 crc kubenswrapper[4851]: I1001 13:26:29.847329 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b850e5c8-7c3d-4ae2-ab48-a17e80c41091-inventory" (OuterVolumeSpecName: "inventory") pod "b850e5c8-7c3d-4ae2-ab48-a17e80c41091" (UID: "b850e5c8-7c3d-4ae2-ab48-a17e80c41091"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:29 crc kubenswrapper[4851]: I1001 13:26:29.856717 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b850e5c8-7c3d-4ae2-ab48-a17e80c41091-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b850e5c8-7c3d-4ae2-ab48-a17e80c41091" (UID: "b850e5c8-7c3d-4ae2-ab48-a17e80c41091"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:29 crc kubenswrapper[4851]: I1001 13:26:29.915098 4851 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b850e5c8-7c3d-4ae2-ab48-a17e80c41091-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:29 crc kubenswrapper[4851]: I1001 13:26:29.915128 4851 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b850e5c8-7c3d-4ae2-ab48-a17e80c41091-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:29 crc kubenswrapper[4851]: I1001 13:26:29.915140 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvtk6\" (UniqueName: \"kubernetes.io/projected/b850e5c8-7c3d-4ae2-ab48-a17e80c41091-kube-api-access-rvtk6\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:30 crc kubenswrapper[4851]: I1001 13:26:30.243142 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-swrg6" event={"ID":"b850e5c8-7c3d-4ae2-ab48-a17e80c41091","Type":"ContainerDied","Data":"1e16ac197d0835b9498ffcc6a1d671105efc8c1acbd4abf78acdeb04087395d6"} Oct 01 13:26:30 crc kubenswrapper[4851]: I1001 13:26:30.243605 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e16ac197d0835b9498ffcc6a1d671105efc8c1acbd4abf78acdeb04087395d6" Oct 01 13:26:30 crc kubenswrapper[4851]: I1001 13:26:30.243208 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-swrg6" Oct 01 13:26:30 crc kubenswrapper[4851]: I1001 13:26:30.360796 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd"] Oct 01 13:26:30 crc kubenswrapper[4851]: E1001 13:26:30.367228 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b850e5c8-7c3d-4ae2-ab48-a17e80c41091" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:26:30 crc kubenswrapper[4851]: I1001 13:26:30.367281 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="b850e5c8-7c3d-4ae2-ab48-a17e80c41091" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:26:30 crc kubenswrapper[4851]: I1001 13:26:30.368954 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="b850e5c8-7c3d-4ae2-ab48-a17e80c41091" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:26:30 crc kubenswrapper[4851]: I1001 13:26:30.371042 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd" Oct 01 13:26:30 crc kubenswrapper[4851]: I1001 13:26:30.374999 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2tz4d" Oct 01 13:26:30 crc kubenswrapper[4851]: I1001 13:26:30.375445 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:26:30 crc kubenswrapper[4851]: I1001 13:26:30.375601 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:26:30 crc kubenswrapper[4851]: I1001 13:26:30.375719 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:26:30 crc kubenswrapper[4851]: I1001 13:26:30.389244 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd"] Oct 01 13:26:30 crc kubenswrapper[4851]: I1001 13:26:30.430819 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0168fd9f-0f7b-432d-a09d-927ac34e34b3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd\" (UID: \"0168fd9f-0f7b-432d-a09d-927ac34e34b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd" Oct 01 13:26:30 crc kubenswrapper[4851]: I1001 13:26:30.431181 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0168fd9f-0f7b-432d-a09d-927ac34e34b3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd\" (UID: \"0168fd9f-0f7b-432d-a09d-927ac34e34b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd" Oct 01 13:26:30 crc kubenswrapper[4851]: I1001 13:26:30.431576 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6nr5\" (UniqueName: \"kubernetes.io/projected/0168fd9f-0f7b-432d-a09d-927ac34e34b3-kube-api-access-w6nr5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd\" (UID: \"0168fd9f-0f7b-432d-a09d-927ac34e34b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd" Oct 01 13:26:30 crc kubenswrapper[4851]: I1001 13:26:30.533765 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6nr5\" (UniqueName: \"kubernetes.io/projected/0168fd9f-0f7b-432d-a09d-927ac34e34b3-kube-api-access-w6nr5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd\" (UID: \"0168fd9f-0f7b-432d-a09d-927ac34e34b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd" Oct 01 13:26:30 crc kubenswrapper[4851]: I1001 13:26:30.533981 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0168fd9f-0f7b-432d-a09d-927ac34e34b3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd\" (UID: \"0168fd9f-0f7b-432d-a09d-927ac34e34b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd" Oct 01 13:26:30 crc kubenswrapper[4851]: I1001 13:26:30.534112 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0168fd9f-0f7b-432d-a09d-927ac34e34b3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd\" 
(UID: \"0168fd9f-0f7b-432d-a09d-927ac34e34b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd" Oct 01 13:26:30 crc kubenswrapper[4851]: I1001 13:26:30.540579 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0168fd9f-0f7b-432d-a09d-927ac34e34b3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd\" (UID: \"0168fd9f-0f7b-432d-a09d-927ac34e34b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd" Oct 01 13:26:30 crc kubenswrapper[4851]: I1001 13:26:30.545307 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0168fd9f-0f7b-432d-a09d-927ac34e34b3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd\" (UID: \"0168fd9f-0f7b-432d-a09d-927ac34e34b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd" Oct 01 13:26:30 crc kubenswrapper[4851]: I1001 13:26:30.556148 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6nr5\" (UniqueName: \"kubernetes.io/projected/0168fd9f-0f7b-432d-a09d-927ac34e34b3-kube-api-access-w6nr5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd\" (UID: \"0168fd9f-0f7b-432d-a09d-927ac34e34b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd" Oct 01 13:26:30 crc kubenswrapper[4851]: I1001 13:26:30.705392 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd" Oct 01 13:26:31 crc kubenswrapper[4851]: I1001 13:26:31.327776 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd"] Oct 01 13:26:32 crc kubenswrapper[4851]: I1001 13:26:32.276825 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd" event={"ID":"0168fd9f-0f7b-432d-a09d-927ac34e34b3","Type":"ContainerStarted","Data":"cb027a5406be4685057e334404ee81b529e4b4424e2f015b9429976d1a46841a"} Oct 01 13:26:32 crc kubenswrapper[4851]: I1001 13:26:32.277352 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd" event={"ID":"0168fd9f-0f7b-432d-a09d-927ac34e34b3","Type":"ContainerStarted","Data":"34af3a2362d330c64e9a58104454c8149170c89001228baa429c7fbedf3a4faa"} Oct 01 13:26:32 crc kubenswrapper[4851]: I1001 13:26:32.315021 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd" podStartSLOduration=1.813894507 podStartE2EDuration="2.314990844s" podCreationTimestamp="2025-10-01 13:26:30 +0000 UTC" firstStartedPulling="2025-10-01 13:26:31.330934675 +0000 UTC m=+1999.676052161" lastFinishedPulling="2025-10-01 13:26:31.832031002 +0000 UTC m=+2000.177148498" observedRunningTime="2025-10-01 13:26:32.304812103 +0000 UTC m=+2000.649929629" watchObservedRunningTime="2025-10-01 13:26:32.314990844 +0000 UTC m=+2000.660108370" Oct 01 13:26:54 crc kubenswrapper[4851]: I1001 13:26:54.933797 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jkphk"] Oct 01 13:26:54 crc kubenswrapper[4851]: I1001 13:26:54.938559 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jkphk" Oct 01 13:26:54 crc kubenswrapper[4851]: I1001 13:26:54.947875 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jkphk"] Oct 01 13:26:55 crc kubenswrapper[4851]: I1001 13:26:55.040185 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f726e441-37fb-4033-a42b-61beb60cf307-catalog-content\") pod \"redhat-marketplace-jkphk\" (UID: \"f726e441-37fb-4033-a42b-61beb60cf307\") " pod="openshift-marketplace/redhat-marketplace-jkphk" Oct 01 13:26:55 crc kubenswrapper[4851]: I1001 13:26:55.040276 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59xz9\" (UniqueName: \"kubernetes.io/projected/f726e441-37fb-4033-a42b-61beb60cf307-kube-api-access-59xz9\") pod \"redhat-marketplace-jkphk\" (UID: \"f726e441-37fb-4033-a42b-61beb60cf307\") " pod="openshift-marketplace/redhat-marketplace-jkphk" Oct 01 13:26:55 crc kubenswrapper[4851]: I1001 13:26:55.040342 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f726e441-37fb-4033-a42b-61beb60cf307-utilities\") pod \"redhat-marketplace-jkphk\" (UID: \"f726e441-37fb-4033-a42b-61beb60cf307\") " pod="openshift-marketplace/redhat-marketplace-jkphk" Oct 01 13:26:55 crc kubenswrapper[4851]: I1001 13:26:55.142858 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f726e441-37fb-4033-a42b-61beb60cf307-catalog-content\") pod \"redhat-marketplace-jkphk\" (UID: \"f726e441-37fb-4033-a42b-61beb60cf307\") " pod="openshift-marketplace/redhat-marketplace-jkphk" Oct 01 13:26:55 crc kubenswrapper[4851]: I1001 13:26:55.143258 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59xz9\" (UniqueName: \"kubernetes.io/projected/f726e441-37fb-4033-a42b-61beb60cf307-kube-api-access-59xz9\") pod \"redhat-marketplace-jkphk\" (UID: \"f726e441-37fb-4033-a42b-61beb60cf307\") " pod="openshift-marketplace/redhat-marketplace-jkphk" Oct 01 13:26:55 crc kubenswrapper[4851]: I1001 13:26:55.143493 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f726e441-37fb-4033-a42b-61beb60cf307-catalog-content\") pod \"redhat-marketplace-jkphk\" (UID: \"f726e441-37fb-4033-a42b-61beb60cf307\") " pod="openshift-marketplace/redhat-marketplace-jkphk" Oct 01 13:26:55 crc kubenswrapper[4851]: I1001 13:26:55.143662 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f726e441-37fb-4033-a42b-61beb60cf307-utilities\") pod \"redhat-marketplace-jkphk\" (UID: \"f726e441-37fb-4033-a42b-61beb60cf307\") " pod="openshift-marketplace/redhat-marketplace-jkphk" Oct 01 13:26:55 crc kubenswrapper[4851]: I1001 13:26:55.143962 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f726e441-37fb-4033-a42b-61beb60cf307-utilities\") pod \"redhat-marketplace-jkphk\" (UID: \"f726e441-37fb-4033-a42b-61beb60cf307\") " pod="openshift-marketplace/redhat-marketplace-jkphk" Oct 01 13:26:55 crc kubenswrapper[4851]: I1001 13:26:55.163768 4851 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-59xz9\" (UniqueName: \"kubernetes.io/projected/f726e441-37fb-4033-a42b-61beb60cf307-kube-api-access-59xz9\") pod \"redhat-marketplace-jkphk\" (UID: \"f726e441-37fb-4033-a42b-61beb60cf307\") " pod="openshift-marketplace/redhat-marketplace-jkphk" Oct 01 13:26:55 crc kubenswrapper[4851]: I1001 13:26:55.273277 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jkphk" Oct 01 13:26:55 crc kubenswrapper[4851]: I1001 13:26:55.779619 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jkphk"] Oct 01 13:26:55 crc kubenswrapper[4851]: W1001 13:26:55.789695 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf726e441_37fb_4033_a42b_61beb60cf307.slice/crio-3baf4bbcef574f57c7530db3c0fef3e4ba49819dbc62b84ae4750a7051f650fa WatchSource:0}: Error finding container 3baf4bbcef574f57c7530db3c0fef3e4ba49819dbc62b84ae4750a7051f650fa: Status 404 returned error can't find the container with id 3baf4bbcef574f57c7530db3c0fef3e4ba49819dbc62b84ae4750a7051f650fa Oct 01 13:26:56 crc kubenswrapper[4851]: I1001 13:26:56.511736 4851 generic.go:334] "Generic (PLEG): container finished" podID="f726e441-37fb-4033-a42b-61beb60cf307" containerID="5a02511dc76343d2d27d4fbe99cf9acb1728e11b29d62476969ff1fb804a4bf5" exitCode=0 Oct 01 13:26:56 crc kubenswrapper[4851]: I1001 13:26:56.512064 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkphk" event={"ID":"f726e441-37fb-4033-a42b-61beb60cf307","Type":"ContainerDied","Data":"5a02511dc76343d2d27d4fbe99cf9acb1728e11b29d62476969ff1fb804a4bf5"} Oct 01 13:26:56 crc kubenswrapper[4851]: I1001 13:26:56.512413 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkphk" event={"ID":"f726e441-37fb-4033-a42b-61beb60cf307","Type":"ContainerStarted","Data":"3baf4bbcef574f57c7530db3c0fef3e4ba49819dbc62b84ae4750a7051f650fa"} Oct 01 13:26:57 crc kubenswrapper[4851]: I1001 13:26:57.526806 4851 generic.go:334] "Generic (PLEG): container finished" podID="f726e441-37fb-4033-a42b-61beb60cf307" containerID="21db3fb6fb1b0f7f6144b26e04d93ad343024cc32f665cb8dad31ebeb631dc24" exitCode=0 Oct 01 13:26:57 crc kubenswrapper[4851]: I1001 13:26:57.526903 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkphk" event={"ID":"f726e441-37fb-4033-a42b-61beb60cf307","Type":"ContainerDied","Data":"21db3fb6fb1b0f7f6144b26e04d93ad343024cc32f665cb8dad31ebeb631dc24"} Oct 01 13:26:58 crc kubenswrapper[4851]: I1001 13:26:58.550037 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkphk" event={"ID":"f726e441-37fb-4033-a42b-61beb60cf307","Type":"ContainerStarted","Data":"9bf403a26f763aa6a54e97d89e24ca4e55a338bf4bba0607e749c12c4ccc2714"} Oct 01 13:26:58 crc kubenswrapper[4851]: I1001 13:26:58.586481 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jkphk" podStartSLOduration=3.127510109 podStartE2EDuration="4.586274195s" podCreationTimestamp="2025-10-01 13:26:54 +0000 UTC" firstStartedPulling="2025-10-01 13:26:56.514695945 +0000 UTC m=+2024.859813431" lastFinishedPulling="2025-10-01 13:26:57.973460031 +0000 UTC m=+2026.318577517" observedRunningTime="2025-10-01 13:26:58.584148104 +0000 UTC m=+2026.929265630" 
watchObservedRunningTime="2025-10-01 13:26:58.586274195 +0000 UTC m=+2026.931391711" Oct 01 13:27:00 crc kubenswrapper[4851]: I1001 13:27:00.050539 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:27:00 crc kubenswrapper[4851]: I1001 13:27:00.050654 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:27:05 crc kubenswrapper[4851]: I1001 13:27:05.274655 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jkphk" Oct 01 13:27:05 crc kubenswrapper[4851]: I1001 13:27:05.275471 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jkphk" Oct 01 13:27:05 crc kubenswrapper[4851]: I1001 13:27:05.332310 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jkphk" Oct 01 13:27:05 crc kubenswrapper[4851]: I1001 13:27:05.677841 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jkphk" Oct 01 13:27:05 crc kubenswrapper[4851]: I1001 13:27:05.742459 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jkphk"] Oct 01 13:27:07 crc kubenswrapper[4851]: I1001 13:27:07.645109 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jkphk" podUID="f726e441-37fb-4033-a42b-61beb60cf307" containerName="registry-server" containerID="cri-o://9bf403a26f763aa6a54e97d89e24ca4e55a338bf4bba0607e749c12c4ccc2714" gracePeriod=2 Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.208194 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jkphk" Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.335015 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f726e441-37fb-4033-a42b-61beb60cf307-utilities\") pod \"f726e441-37fb-4033-a42b-61beb60cf307\" (UID: \"f726e441-37fb-4033-a42b-61beb60cf307\") " Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.335229 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f726e441-37fb-4033-a42b-61beb60cf307-catalog-content\") pod \"f726e441-37fb-4033-a42b-61beb60cf307\" (UID: \"f726e441-37fb-4033-a42b-61beb60cf307\") " Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.335251 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59xz9\" (UniqueName: \"kubernetes.io/projected/f726e441-37fb-4033-a42b-61beb60cf307-kube-api-access-59xz9\") pod \"f726e441-37fb-4033-a42b-61beb60cf307\" (UID: \"f726e441-37fb-4033-a42b-61beb60cf307\") " Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.336663 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f726e441-37fb-4033-a42b-61beb60cf307-utilities" (OuterVolumeSpecName: "utilities") pod "f726e441-37fb-4033-a42b-61beb60cf307" (UID: "f726e441-37fb-4033-a42b-61beb60cf307"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.348840 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f726e441-37fb-4033-a42b-61beb60cf307-kube-api-access-59xz9" (OuterVolumeSpecName: "kube-api-access-59xz9") pod "f726e441-37fb-4033-a42b-61beb60cf307" (UID: "f726e441-37fb-4033-a42b-61beb60cf307"). InnerVolumeSpecName "kube-api-access-59xz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.354059 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f726e441-37fb-4033-a42b-61beb60cf307-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f726e441-37fb-4033-a42b-61beb60cf307" (UID: "f726e441-37fb-4033-a42b-61beb60cf307"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.438317 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f726e441-37fb-4033-a42b-61beb60cf307-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.438364 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f726e441-37fb-4033-a42b-61beb60cf307-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.438386 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59xz9\" (UniqueName: \"kubernetes.io/projected/f726e441-37fb-4033-a42b-61beb60cf307-kube-api-access-59xz9\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.658667 4851 generic.go:334] "Generic (PLEG): container finished" podID="f726e441-37fb-4033-a42b-61beb60cf307" containerID="9bf403a26f763aa6a54e97d89e24ca4e55a338bf4bba0607e749c12c4ccc2714" exitCode=0 Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.658733 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkphk" event={"ID":"f726e441-37fb-4033-a42b-61beb60cf307","Type":"ContainerDied","Data":"9bf403a26f763aa6a54e97d89e24ca4e55a338bf4bba0607e749c12c4ccc2714"} Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.658769 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jkphk" Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.658783 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkphk" event={"ID":"f726e441-37fb-4033-a42b-61beb60cf307","Type":"ContainerDied","Data":"3baf4bbcef574f57c7530db3c0fef3e4ba49819dbc62b84ae4750a7051f650fa"} Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.658812 4851 scope.go:117] "RemoveContainer" containerID="9bf403a26f763aa6a54e97d89e24ca4e55a338bf4bba0607e749c12c4ccc2714" Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.697074 4851 scope.go:117] "RemoveContainer" containerID="21db3fb6fb1b0f7f6144b26e04d93ad343024cc32f665cb8dad31ebeb631dc24" Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.708679 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jkphk"] Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.718164 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jkphk"] Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.745755 4851 scope.go:117] "RemoveContainer" containerID="5a02511dc76343d2d27d4fbe99cf9acb1728e11b29d62476969ff1fb804a4bf5" Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.775913 4851 scope.go:117] "RemoveContainer" containerID="9bf403a26f763aa6a54e97d89e24ca4e55a338bf4bba0607e749c12c4ccc2714" Oct 01 13:27:08 crc kubenswrapper[4851]: E1001 13:27:08.776402 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bf403a26f763aa6a54e97d89e24ca4e55a338bf4bba0607e749c12c4ccc2714\": container with ID starting with 9bf403a26f763aa6a54e97d89e24ca4e55a338bf4bba0607e749c12c4ccc2714 not found: ID does not exist" containerID="9bf403a26f763aa6a54e97d89e24ca4e55a338bf4bba0607e749c12c4ccc2714" Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.776460 4851 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bf403a26f763aa6a54e97d89e24ca4e55a338bf4bba0607e749c12c4ccc2714"} err="failed to get container status \"9bf403a26f763aa6a54e97d89e24ca4e55a338bf4bba0607e749c12c4ccc2714\": rpc error: code = NotFound desc = could not find container \"9bf403a26f763aa6a54e97d89e24ca4e55a338bf4bba0607e749c12c4ccc2714\": container with ID starting with 9bf403a26f763aa6a54e97d89e24ca4e55a338bf4bba0607e749c12c4ccc2714 not found: ID does not exist"
Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.776536 4851 scope.go:117] "RemoveContainer" containerID="21db3fb6fb1b0f7f6144b26e04d93ad343024cc32f665cb8dad31ebeb631dc24"
Oct 01 13:27:08 crc kubenswrapper[4851]: E1001 13:27:08.777013 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21db3fb6fb1b0f7f6144b26e04d93ad343024cc32f665cb8dad31ebeb631dc24\": container with ID starting with 21db3fb6fb1b0f7f6144b26e04d93ad343024cc32f665cb8dad31ebeb631dc24 not found: ID does not exist" containerID="21db3fb6fb1b0f7f6144b26e04d93ad343024cc32f665cb8dad31ebeb631dc24"
Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.777039 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21db3fb6fb1b0f7f6144b26e04d93ad343024cc32f665cb8dad31ebeb631dc24"} err="failed to get container status \"21db3fb6fb1b0f7f6144b26e04d93ad343024cc32f665cb8dad31ebeb631dc24\": rpc error: code = NotFound desc = could not find container \"21db3fb6fb1b0f7f6144b26e04d93ad343024cc32f665cb8dad31ebeb631dc24\": container with ID starting with 21db3fb6fb1b0f7f6144b26e04d93ad343024cc32f665cb8dad31ebeb631dc24 not found: ID does not exist"
Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.777054 4851 scope.go:117] "RemoveContainer" containerID="5a02511dc76343d2d27d4fbe99cf9acb1728e11b29d62476969ff1fb804a4bf5"
Oct 01 13:27:08 crc kubenswrapper[4851]: E1001 13:27:08.777307 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a02511dc76343d2d27d4fbe99cf9acb1728e11b29d62476969ff1fb804a4bf5\": container with ID starting with 5a02511dc76343d2d27d4fbe99cf9acb1728e11b29d62476969ff1fb804a4bf5 not found: ID does not exist" containerID="5a02511dc76343d2d27d4fbe99cf9acb1728e11b29d62476969ff1fb804a4bf5"
Oct 01 13:27:08 crc kubenswrapper[4851]: I1001 13:27:08.777354 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a02511dc76343d2d27d4fbe99cf9acb1728e11b29d62476969ff1fb804a4bf5"} err="failed to get container status \"5a02511dc76343d2d27d4fbe99cf9acb1728e11b29d62476969ff1fb804a4bf5\": rpc error: code = NotFound desc = could not find container \"5a02511dc76343d2d27d4fbe99cf9acb1728e11b29d62476969ff1fb804a4bf5\": container with ID starting with 5a02511dc76343d2d27d4fbe99cf9acb1728e11b29d62476969ff1fb804a4bf5 not found: ID does not exist"
Oct 01 13:27:10 crc kubenswrapper[4851]: I1001 13:27:10.351047 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f726e441-37fb-4033-a42b-61beb60cf307" path="/var/lib/kubelet/pods/f726e441-37fb-4033-a42b-61beb60cf307/volumes"
Oct 01 13:27:30 crc kubenswrapper[4851]: I1001 13:27:30.049966 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:27:30 crc kubenswrapper[4851]: I1001 13:27:30.050817 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:27:33 crc kubenswrapper[4851]: I1001 13:27:33.971411 4851 generic.go:334] "Generic (PLEG): container finished" podID="0168fd9f-0f7b-432d-a09d-927ac34e34b3" containerID="cb027a5406be4685057e334404ee81b529e4b4424e2f015b9429976d1a46841a" exitCode=2 Oct 01 13:27:33 crc kubenswrapper[4851]: I1001 13:27:33.971553 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd" event={"ID":"0168fd9f-0f7b-432d-a09d-927ac34e34b3","Type":"ContainerDied","Data":"cb027a5406be4685057e334404ee81b529e4b4424e2f015b9429976d1a46841a"} Oct 01 13:27:35 crc kubenswrapper[4851]: I1001 13:27:35.539648 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd" Oct 01 13:27:35 crc kubenswrapper[4851]: I1001 13:27:35.672760 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0168fd9f-0f7b-432d-a09d-927ac34e34b3-ssh-key\") pod \"0168fd9f-0f7b-432d-a09d-927ac34e34b3\" (UID: \"0168fd9f-0f7b-432d-a09d-927ac34e34b3\") " Oct 01 13:27:35 crc kubenswrapper[4851]: I1001 13:27:35.673078 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0168fd9f-0f7b-432d-a09d-927ac34e34b3-inventory\") pod \"0168fd9f-0f7b-432d-a09d-927ac34e34b3\" (UID: \"0168fd9f-0f7b-432d-a09d-927ac34e34b3\") " Oct 01 13:27:35 crc kubenswrapper[4851]: I1001 13:27:35.673168 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6nr5\" (UniqueName: \"kubernetes.io/projected/0168fd9f-0f7b-432d-a09d-927ac34e34b3-kube-api-access-w6nr5\") pod \"0168fd9f-0f7b-432d-a09d-927ac34e34b3\" (UID: \"0168fd9f-0f7b-432d-a09d-927ac34e34b3\") " Oct 01 13:27:35 crc kubenswrapper[4851]: I1001 13:27:35.683806 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0168fd9f-0f7b-432d-a09d-927ac34e34b3-kube-api-access-w6nr5" (OuterVolumeSpecName: "kube-api-access-w6nr5") pod "0168fd9f-0f7b-432d-a09d-927ac34e34b3" (UID: "0168fd9f-0f7b-432d-a09d-927ac34e34b3"). InnerVolumeSpecName "kube-api-access-w6nr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:27:35 crc kubenswrapper[4851]: I1001 13:27:35.713948 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0168fd9f-0f7b-432d-a09d-927ac34e34b3-inventory" (OuterVolumeSpecName: "inventory") pod "0168fd9f-0f7b-432d-a09d-927ac34e34b3" (UID: "0168fd9f-0f7b-432d-a09d-927ac34e34b3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:27:35 crc kubenswrapper[4851]: I1001 13:27:35.731721 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0168fd9f-0f7b-432d-a09d-927ac34e34b3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0168fd9f-0f7b-432d-a09d-927ac34e34b3" (UID: "0168fd9f-0f7b-432d-a09d-927ac34e34b3"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:27:35 crc kubenswrapper[4851]: I1001 13:27:35.776562 4851 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0168fd9f-0f7b-432d-a09d-927ac34e34b3-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:35 crc kubenswrapper[4851]: I1001 13:27:35.776696 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6nr5\" (UniqueName: \"kubernetes.io/projected/0168fd9f-0f7b-432d-a09d-927ac34e34b3-kube-api-access-w6nr5\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:35 crc kubenswrapper[4851]: I1001 13:27:35.776892 4851 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0168fd9f-0f7b-432d-a09d-927ac34e34b3-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:36 crc kubenswrapper[4851]: I1001 13:27:36.002890 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd" event={"ID":"0168fd9f-0f7b-432d-a09d-927ac34e34b3","Type":"ContainerDied","Data":"34af3a2362d330c64e9a58104454c8149170c89001228baa429c7fbedf3a4faa"} Oct 01 13:27:36 crc kubenswrapper[4851]: I1001 13:27:36.002952 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34af3a2362d330c64e9a58104454c8149170c89001228baa429c7fbedf3a4faa" Oct 01 13:27:36 crc kubenswrapper[4851]: I1001 13:27:36.002966 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd" Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.057824 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-khbc7"] Oct 01 13:27:43 crc kubenswrapper[4851]: E1001 13:27:43.059151 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f726e441-37fb-4033-a42b-61beb60cf307" containerName="extract-utilities" Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.059178 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="f726e441-37fb-4033-a42b-61beb60cf307" containerName="extract-utilities" Oct 01 13:27:43 crc kubenswrapper[4851]: E1001 13:27:43.059213 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f726e441-37fb-4033-a42b-61beb60cf307" containerName="extract-content" Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.059227 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="f726e441-37fb-4033-a42b-61beb60cf307" containerName="extract-content" Oct 01 13:27:43 crc kubenswrapper[4851]: E1001 13:27:43.059251 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0168fd9f-0f7b-432d-a09d-927ac34e34b3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.059264 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="0168fd9f-0f7b-432d-a09d-927ac34e34b3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:27:43 crc kubenswrapper[4851]: E1001 13:27:43.059292 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f726e441-37fb-4033-a42b-61beb60cf307" containerName="registry-server" Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.059303 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="f726e441-37fb-4033-a42b-61beb60cf307" containerName="registry-server" Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.059650 4851 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f726e441-37fb-4033-a42b-61beb60cf307" containerName="registry-server" Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.059720 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="0168fd9f-0f7b-432d-a09d-927ac34e34b3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.060904 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-khbc7" Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.064708 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.064728 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.065112 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2tz4d" Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.065117 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.086729 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-khbc7"] Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.146564 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7966e9d5-430c-417e-9ba2-b53c598831e7-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-khbc7\" (UID: \"7966e9d5-430c-417e-9ba2-b53c598831e7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-khbc7" Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.146669 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7966e9d5-430c-417e-9ba2-b53c598831e7-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-khbc7\" (UID: \"7966e9d5-430c-417e-9ba2-b53c598831e7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-khbc7" Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.147021 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s7xg\" (UniqueName: \"kubernetes.io/projected/7966e9d5-430c-417e-9ba2-b53c598831e7-kube-api-access-2s7xg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-khbc7\" (UID: \"7966e9d5-430c-417e-9ba2-b53c598831e7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-khbc7" Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.249892 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s7xg\" (UniqueName: \"kubernetes.io/projected/7966e9d5-430c-417e-9ba2-b53c598831e7-kube-api-access-2s7xg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-khbc7\" (UID: \"7966e9d5-430c-417e-9ba2-b53c598831e7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-khbc7" Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.250129 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7966e9d5-430c-417e-9ba2-b53c598831e7-inventory\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-khbc7\" (UID: \"7966e9d5-430c-417e-9ba2-b53c598831e7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-khbc7" Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.250279 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7966e9d5-430c-417e-9ba2-b53c598831e7-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-khbc7\" (UID: \"7966e9d5-430c-417e-9ba2-b53c598831e7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-khbc7" Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.260137 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7966e9d5-430c-417e-9ba2-b53c598831e7-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-khbc7\" (UID: \"7966e9d5-430c-417e-9ba2-b53c598831e7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-khbc7" Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.260481 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7966e9d5-430c-417e-9ba2-b53c598831e7-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-khbc7\" (UID: \"7966e9d5-430c-417e-9ba2-b53c598831e7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-khbc7" Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.270866 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s7xg\" (UniqueName: \"kubernetes.io/projected/7966e9d5-430c-417e-9ba2-b53c598831e7-kube-api-access-2s7xg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-khbc7\" (UID: \"7966e9d5-430c-417e-9ba2-b53c598831e7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-khbc7" Oct 01 13:27:43 crc kubenswrapper[4851]: I1001 13:27:43.397396 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-khbc7" Oct 01 13:27:44 crc kubenswrapper[4851]: I1001 13:27:44.026043 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-khbc7"] Oct 01 13:27:44 crc kubenswrapper[4851]: I1001 13:27:44.120590 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-khbc7" event={"ID":"7966e9d5-430c-417e-9ba2-b53c598831e7","Type":"ContainerStarted","Data":"87f6a1f56f9ec1a63cd5caae83749998d9f784c3f47b3c9acddfe699d3ce8988"} Oct 01 13:27:45 crc kubenswrapper[4851]: I1001 13:27:45.137940 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-khbc7" event={"ID":"7966e9d5-430c-417e-9ba2-b53c598831e7","Type":"ContainerStarted","Data":"ca9089e883804b168e3e2713985207000dff22360de477a19b08730034685f73"} Oct 01 13:27:45 crc kubenswrapper[4851]: I1001 13:27:45.166352 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-khbc7" podStartSLOduration=1.565315362 podStartE2EDuration="2.166322969s" podCreationTimestamp="2025-10-01 13:27:43 +0000 UTC" firstStartedPulling="2025-10-01 13:27:44.032276192 +0000 UTC m=+2072.377393668" lastFinishedPulling="2025-10-01 13:27:44.633283779 +0000 UTC m=+2072.978401275" observedRunningTime="2025-10-01 13:27:45.156053476 +0000 UTC m=+2073.501170992" watchObservedRunningTime="2025-10-01 13:27:45.166322969 +0000 UTC m=+2073.511440465" Oct 01 13:28:00 crc kubenswrapper[4851]: I1001 13:28:00.049584 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:28:00 crc kubenswrapper[4851]: I1001 13:28:00.050100 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:28:00 crc kubenswrapper[4851]: I1001 13:28:00.050152 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 13:28:00 crc kubenswrapper[4851]: I1001 13:28:00.051026 4851 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2cf857fcafcc27114560c9d5ed62e037c4c2e05f3ac9187b9ce4b4a9bc35966e"} pod="openshift-machine-config-operator/machine-config-daemon-fv72m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:28:00 crc kubenswrapper[4851]: I1001 13:28:00.051092 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" containerID="cri-o://2cf857fcafcc27114560c9d5ed62e037c4c2e05f3ac9187b9ce4b4a9bc35966e" gracePeriod=600 Oct 01 13:28:00 crc kubenswrapper[4851]: I1001 13:28:00.341548 4851 generic.go:334] "Generic (PLEG): container finished" 
podID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerID="2cf857fcafcc27114560c9d5ed62e037c4c2e05f3ac9187b9ce4b4a9bc35966e" exitCode=0 Oct 01 13:28:00 crc kubenswrapper[4851]: I1001 13:28:00.346065 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerDied","Data":"2cf857fcafcc27114560c9d5ed62e037c4c2e05f3ac9187b9ce4b4a9bc35966e"} Oct 01 13:28:00 crc kubenswrapper[4851]: I1001 13:28:00.346162 4851 scope.go:117] "RemoveContainer" containerID="72b6928ab01863d237fd3464819e9288a0d89d58ac95977e1aafa749b406ffe7" Oct 01 13:28:00 crc kubenswrapper[4851]: I1001 13:28:00.585849 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cqtfx"] Oct 01 13:28:00 crc kubenswrapper[4851]: I1001 13:28:00.588071 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cqtfx" Oct 01 13:28:00 crc kubenswrapper[4851]: I1001 13:28:00.628668 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cqtfx"] Oct 01 13:28:00 crc kubenswrapper[4851]: I1001 13:28:00.681334 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a020becb-a047-4f6e-bce3-394f24d8bc14-catalog-content\") pod \"certified-operators-cqtfx\" (UID: \"a020becb-a047-4f6e-bce3-394f24d8bc14\") " pod="openshift-marketplace/certified-operators-cqtfx" Oct 01 13:28:00 crc kubenswrapper[4851]: I1001 13:28:00.681678 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a020becb-a047-4f6e-bce3-394f24d8bc14-utilities\") pod \"certified-operators-cqtfx\" (UID: \"a020becb-a047-4f6e-bce3-394f24d8bc14\") " pod="openshift-marketplace/certified-operators-cqtfx" Oct 01 13:28:00 crc kubenswrapper[4851]: I1001 13:28:00.681708 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5q5s\" (UniqueName: \"kubernetes.io/projected/a020becb-a047-4f6e-bce3-394f24d8bc14-kube-api-access-k5q5s\") pod \"certified-operators-cqtfx\" (UID: \"a020becb-a047-4f6e-bce3-394f24d8bc14\") " pod="openshift-marketplace/certified-operators-cqtfx" Oct 01 13:28:00 crc kubenswrapper[4851]: I1001 13:28:00.783247 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a020becb-a047-4f6e-bce3-394f24d8bc14-utilities\") pod \"certified-operators-cqtfx\" (UID: \"a020becb-a047-4f6e-bce3-394f24d8bc14\") " pod="openshift-marketplace/certified-operators-cqtfx" Oct 01 13:28:00 crc kubenswrapper[4851]: I1001 13:28:00.783291 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5q5s\" (UniqueName: \"kubernetes.io/projected/a020becb-a047-4f6e-bce3-394f24d8bc14-kube-api-access-k5q5s\") pod \"certified-operators-cqtfx\" (UID: \"a020becb-a047-4f6e-bce3-394f24d8bc14\") " pod="openshift-marketplace/certified-operators-cqtfx" Oct 01 13:28:00 crc kubenswrapper[4851]: I1001 13:28:00.783442 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a020becb-a047-4f6e-bce3-394f24d8bc14-catalog-content\") pod \"certified-operators-cqtfx\" (UID: 
\"a020becb-a047-4f6e-bce3-394f24d8bc14\") " pod="openshift-marketplace/certified-operators-cqtfx" Oct 01 13:28:00 crc kubenswrapper[4851]: I1001 13:28:00.783737 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a020becb-a047-4f6e-bce3-394f24d8bc14-utilities\") pod \"certified-operators-cqtfx\" (UID: \"a020becb-a047-4f6e-bce3-394f24d8bc14\") " pod="openshift-marketplace/certified-operators-cqtfx" Oct 01 13:28:00 crc kubenswrapper[4851]: I1001 13:28:00.784030 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a020becb-a047-4f6e-bce3-394f24d8bc14-catalog-content\") pod \"certified-operators-cqtfx\" (UID: \"a020becb-a047-4f6e-bce3-394f24d8bc14\") " pod="openshift-marketplace/certified-operators-cqtfx" Oct 01 13:28:00 crc kubenswrapper[4851]: I1001 13:28:00.804433 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5q5s\" (UniqueName: \"kubernetes.io/projected/a020becb-a047-4f6e-bce3-394f24d8bc14-kube-api-access-k5q5s\") pod \"certified-operators-cqtfx\" (UID: \"a020becb-a047-4f6e-bce3-394f24d8bc14\") " pod="openshift-marketplace/certified-operators-cqtfx" Oct 01 13:28:00 crc kubenswrapper[4851]: I1001 13:28:00.926665 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cqtfx" Oct 01 13:28:01 crc kubenswrapper[4851]: I1001 13:28:01.352003 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb"} Oct 01 13:28:01 crc kubenswrapper[4851]: I1001 13:28:01.429129 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cqtfx"] Oct 01 13:28:02 crc kubenswrapper[4851]: I1001 13:28:02.362291 4851 generic.go:334] "Generic (PLEG): container finished" podID="a020becb-a047-4f6e-bce3-394f24d8bc14" containerID="e680b090092546d18f49582c6acd7fa8c84bf7267992872fe27898a6c79f394d" exitCode=0 Oct 01 13:28:02 crc kubenswrapper[4851]: I1001 13:28:02.362427 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqtfx" event={"ID":"a020becb-a047-4f6e-bce3-394f24d8bc14","Type":"ContainerDied","Data":"e680b090092546d18f49582c6acd7fa8c84bf7267992872fe27898a6c79f394d"} Oct 01 13:28:02 crc kubenswrapper[4851]: I1001 13:28:02.362699 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqtfx" event={"ID":"a020becb-a047-4f6e-bce3-394f24d8bc14","Type":"ContainerStarted","Data":"39afc7a72ca48788cbe0d85baf00f10d2d488dd40cfcb343fbec45faa57bdcbd"} Oct 01 13:28:03 crc kubenswrapper[4851]: I1001 13:28:03.378022 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqtfx" event={"ID":"a020becb-a047-4f6e-bce3-394f24d8bc14","Type":"ContainerStarted","Data":"d8bb913a9171dc3b471385ed4a7a4f91fd0282d033e24b6a233a301f9d7050ee"} Oct 01 13:28:04 crc kubenswrapper[4851]: I1001 13:28:04.389363 4851 generic.go:334] "Generic (PLEG): container finished" podID="a020becb-a047-4f6e-bce3-394f24d8bc14" containerID="d8bb913a9171dc3b471385ed4a7a4f91fd0282d033e24b6a233a301f9d7050ee" exitCode=0 Oct 01 13:28:04 crc kubenswrapper[4851]: I1001 13:28:04.389437 4851 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-cqtfx" event={"ID":"a020becb-a047-4f6e-bce3-394f24d8bc14","Type":"ContainerDied","Data":"d8bb913a9171dc3b471385ed4a7a4f91fd0282d033e24b6a233a301f9d7050ee"} Oct 01 13:28:05 crc kubenswrapper[4851]: I1001 13:28:05.401359 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqtfx" event={"ID":"a020becb-a047-4f6e-bce3-394f24d8bc14","Type":"ContainerStarted","Data":"0279fc3a628b9a3e270f6482e9dfb20e7da6af1b9b999d65bfc80b6db8f3a0a5"} Oct 01 13:28:05 crc kubenswrapper[4851]: I1001 13:28:05.424427 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cqtfx" podStartSLOduration=2.865357717 podStartE2EDuration="5.424410367s" podCreationTimestamp="2025-10-01 13:28:00 +0000 UTC" firstStartedPulling="2025-10-01 13:28:02.364151595 +0000 UTC m=+2090.709269101" lastFinishedPulling="2025-10-01 13:28:04.923204255 +0000 UTC m=+2093.268321751" observedRunningTime="2025-10-01 13:28:05.419128496 +0000 UTC m=+2093.764245982" watchObservedRunningTime="2025-10-01 13:28:05.424410367 +0000 UTC m=+2093.769527843" Oct 01 13:28:10 crc kubenswrapper[4851]: I1001 13:28:10.928012 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cqtfx" Oct 01 13:28:10 crc kubenswrapper[4851]: I1001 13:28:10.928779 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cqtfx" Oct 01 13:28:10 crc kubenswrapper[4851]: I1001 13:28:10.982354 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cqtfx" Oct 01 13:28:11 crc kubenswrapper[4851]: I1001 13:28:11.596196 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cqtfx" Oct 01 13:28:11 crc kubenswrapper[4851]: I1001 13:28:11.652533 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cqtfx"] Oct 01 13:28:13 crc kubenswrapper[4851]: I1001 13:28:13.525237 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cqtfx" podUID="a020becb-a047-4f6e-bce3-394f24d8bc14" containerName="registry-server" containerID="cri-o://0279fc3a628b9a3e270f6482e9dfb20e7da6af1b9b999d65bfc80b6db8f3a0a5" gracePeriod=2 Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.040131 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cqtfx" Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.227317 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a020becb-a047-4f6e-bce3-394f24d8bc14-utilities\") pod \"a020becb-a047-4f6e-bce3-394f24d8bc14\" (UID: \"a020becb-a047-4f6e-bce3-394f24d8bc14\") " Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.227459 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5q5s\" (UniqueName: \"kubernetes.io/projected/a020becb-a047-4f6e-bce3-394f24d8bc14-kube-api-access-k5q5s\") pod \"a020becb-a047-4f6e-bce3-394f24d8bc14\" (UID: \"a020becb-a047-4f6e-bce3-394f24d8bc14\") " Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.227597 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a020becb-a047-4f6e-bce3-394f24d8bc14-catalog-content\") pod \"a020becb-a047-4f6e-bce3-394f24d8bc14\" (UID: \"a020becb-a047-4f6e-bce3-394f24d8bc14\") " Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.228333 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a020becb-a047-4f6e-bce3-394f24d8bc14-utilities" (OuterVolumeSpecName: "utilities") pod "a020becb-a047-4f6e-bce3-394f24d8bc14" (UID: "a020becb-a047-4f6e-bce3-394f24d8bc14"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.236131 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a020becb-a047-4f6e-bce3-394f24d8bc14-kube-api-access-k5q5s" (OuterVolumeSpecName: "kube-api-access-k5q5s") pod "a020becb-a047-4f6e-bce3-394f24d8bc14" (UID: "a020becb-a047-4f6e-bce3-394f24d8bc14"). InnerVolumeSpecName "kube-api-access-k5q5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.330930 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a020becb-a047-4f6e-bce3-394f24d8bc14-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.330965 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5q5s\" (UniqueName: \"kubernetes.io/projected/a020becb-a047-4f6e-bce3-394f24d8bc14-kube-api-access-k5q5s\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.542001 4851 generic.go:334] "Generic (PLEG): container finished" podID="a020becb-a047-4f6e-bce3-394f24d8bc14" containerID="0279fc3a628b9a3e270f6482e9dfb20e7da6af1b9b999d65bfc80b6db8f3a0a5" exitCode=0 Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.542060 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqtfx" event={"ID":"a020becb-a047-4f6e-bce3-394f24d8bc14","Type":"ContainerDied","Data":"0279fc3a628b9a3e270f6482e9dfb20e7da6af1b9b999d65bfc80b6db8f3a0a5"} Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.542098 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqtfx" event={"ID":"a020becb-a047-4f6e-bce3-394f24d8bc14","Type":"ContainerDied","Data":"39afc7a72ca48788cbe0d85baf00f10d2d488dd40cfcb343fbec45faa57bdcbd"} Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.542127 4851 scope.go:117] "RemoveContainer" containerID="0279fc3a628b9a3e270f6482e9dfb20e7da6af1b9b999d65bfc80b6db8f3a0a5" Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.542488 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cqtfx" Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.558839 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a020becb-a047-4f6e-bce3-394f24d8bc14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a020becb-a047-4f6e-bce3-394f24d8bc14" (UID: "a020becb-a047-4f6e-bce3-394f24d8bc14"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.569699 4851 scope.go:117] "RemoveContainer" containerID="d8bb913a9171dc3b471385ed4a7a4f91fd0282d033e24b6a233a301f9d7050ee" Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.614249 4851 scope.go:117] "RemoveContainer" containerID="e680b090092546d18f49582c6acd7fa8c84bf7267992872fe27898a6c79f394d" Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.636969 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a020becb-a047-4f6e-bce3-394f24d8bc14-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.673911 4851 scope.go:117] "RemoveContainer" containerID="0279fc3a628b9a3e270f6482e9dfb20e7da6af1b9b999d65bfc80b6db8f3a0a5" Oct 01 13:28:14 crc kubenswrapper[4851]: E1001 13:28:14.674473 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0279fc3a628b9a3e270f6482e9dfb20e7da6af1b9b999d65bfc80b6db8f3a0a5\": container with ID starting with 0279fc3a628b9a3e270f6482e9dfb20e7da6af1b9b999d65bfc80b6db8f3a0a5 not found: ID does not exist" containerID="0279fc3a628b9a3e270f6482e9dfb20e7da6af1b9b999d65bfc80b6db8f3a0a5" Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.674533 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0279fc3a628b9a3e270f6482e9dfb20e7da6af1b9b999d65bfc80b6db8f3a0a5"} err="failed to get container status \"0279fc3a628b9a3e270f6482e9dfb20e7da6af1b9b999d65bfc80b6db8f3a0a5\": rpc error: code = NotFound desc = could not find container \"0279fc3a628b9a3e270f6482e9dfb20e7da6af1b9b999d65bfc80b6db8f3a0a5\": container with ID starting with 0279fc3a628b9a3e270f6482e9dfb20e7da6af1b9b999d65bfc80b6db8f3a0a5 not found: ID does not exist" Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.674559 4851 scope.go:117] "RemoveContainer" containerID="d8bb913a9171dc3b471385ed4a7a4f91fd0282d033e24b6a233a301f9d7050ee" Oct 01 13:28:14 crc kubenswrapper[4851]: E1001 13:28:14.674977 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8bb913a9171dc3b471385ed4a7a4f91fd0282d033e24b6a233a301f9d7050ee\": container with ID starting with d8bb913a9171dc3b471385ed4a7a4f91fd0282d033e24b6a233a301f9d7050ee not found: ID does not exist" containerID="d8bb913a9171dc3b471385ed4a7a4f91fd0282d033e24b6a233a301f9d7050ee" Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.675033 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8bb913a9171dc3b471385ed4a7a4f91fd0282d033e24b6a233a301f9d7050ee"} err="failed to get container status \"d8bb913a9171dc3b471385ed4a7a4f91fd0282d033e24b6a233a301f9d7050ee\": rpc error: code = NotFound desc = could not find container \"d8bb913a9171dc3b471385ed4a7a4f91fd0282d033e24b6a233a301f9d7050ee\": container with ID starting with d8bb913a9171dc3b471385ed4a7a4f91fd0282d033e24b6a233a301f9d7050ee not found: ID does not exist" Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.675070 4851 scope.go:117] "RemoveContainer" containerID="e680b090092546d18f49582c6acd7fa8c84bf7267992872fe27898a6c79f394d" Oct 01 13:28:14 crc kubenswrapper[4851]: E1001 13:28:14.675473 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e680b090092546d18f49582c6acd7fa8c84bf7267992872fe27898a6c79f394d\": container with ID starting with e680b090092546d18f49582c6acd7fa8c84bf7267992872fe27898a6c79f394d not found: ID does not exist" containerID="e680b090092546d18f49582c6acd7fa8c84bf7267992872fe27898a6c79f394d" Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.675520 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e680b090092546d18f49582c6acd7fa8c84bf7267992872fe27898a6c79f394d"} err="failed to get container status \"e680b090092546d18f49582c6acd7fa8c84bf7267992872fe27898a6c79f394d\": rpc error: code = NotFound desc = could not find container \"e680b090092546d18f49582c6acd7fa8c84bf7267992872fe27898a6c79f394d\": container with ID starting with e680b090092546d18f49582c6acd7fa8c84bf7267992872fe27898a6c79f394d not found: ID does not exist" Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.890546 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cqtfx"] Oct 01 13:28:14 crc kubenswrapper[4851]: I1001 13:28:14.902217 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cqtfx"] Oct 01 13:28:16 crc kubenswrapper[4851]: I1001 13:28:16.341538 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a020becb-a047-4f6e-bce3-394f24d8bc14" path="/var/lib/kubelet/pods/a020becb-a047-4f6e-bce3-394f24d8bc14/volumes" Oct 01 13:28:39 crc kubenswrapper[4851]: I1001 13:28:39.809762 4851 generic.go:334] "Generic (PLEG): container finished" podID="7966e9d5-430c-417e-9ba2-b53c598831e7" containerID="ca9089e883804b168e3e2713985207000dff22360de477a19b08730034685f73" exitCode=0 Oct 01 13:28:39 crc kubenswrapper[4851]: I1001 13:28:39.809826 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-khbc7" event={"ID":"7966e9d5-430c-417e-9ba2-b53c598831e7","Type":"ContainerDied","Data":"ca9089e883804b168e3e2713985207000dff22360de477a19b08730034685f73"} Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.323234 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-khbc7" Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.421824 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7966e9d5-430c-417e-9ba2-b53c598831e7-ssh-key\") pod \"7966e9d5-430c-417e-9ba2-b53c598831e7\" (UID: \"7966e9d5-430c-417e-9ba2-b53c598831e7\") " Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.421974 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7966e9d5-430c-417e-9ba2-b53c598831e7-inventory\") pod \"7966e9d5-430c-417e-9ba2-b53c598831e7\" (UID: \"7966e9d5-430c-417e-9ba2-b53c598831e7\") " Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.422202 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s7xg\" (UniqueName: \"kubernetes.io/projected/7966e9d5-430c-417e-9ba2-b53c598831e7-kube-api-access-2s7xg\") pod \"7966e9d5-430c-417e-9ba2-b53c598831e7\" (UID: \"7966e9d5-430c-417e-9ba2-b53c598831e7\") " Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.429751 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7966e9d5-430c-417e-9ba2-b53c598831e7-kube-api-access-2s7xg" (OuterVolumeSpecName: "kube-api-access-2s7xg") pod "7966e9d5-430c-417e-9ba2-b53c598831e7" (UID: "7966e9d5-430c-417e-9ba2-b53c598831e7"). InnerVolumeSpecName "kube-api-access-2s7xg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.457302 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7966e9d5-430c-417e-9ba2-b53c598831e7-inventory" (OuterVolumeSpecName: "inventory") pod "7966e9d5-430c-417e-9ba2-b53c598831e7" (UID: "7966e9d5-430c-417e-9ba2-b53c598831e7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.467928 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7966e9d5-430c-417e-9ba2-b53c598831e7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7966e9d5-430c-417e-9ba2-b53c598831e7" (UID: "7966e9d5-430c-417e-9ba2-b53c598831e7"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.525100 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s7xg\" (UniqueName: \"kubernetes.io/projected/7966e9d5-430c-417e-9ba2-b53c598831e7-kube-api-access-2s7xg\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.525557 4851 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7966e9d5-430c-417e-9ba2-b53c598831e7-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.525671 4851 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7966e9d5-430c-417e-9ba2-b53c598831e7-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.841986 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-khbc7" event={"ID":"7966e9d5-430c-417e-9ba2-b53c598831e7","Type":"ContainerDied","Data":"87f6a1f56f9ec1a63cd5caae83749998d9f784c3f47b3c9acddfe699d3ce8988"} Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.842065 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87f6a1f56f9ec1a63cd5caae83749998d9f784c3f47b3c9acddfe699d3ce8988" Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.842097 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-khbc7" Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.943717 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2cbdw"] Oct 01 13:28:41 crc kubenswrapper[4851]: E1001 13:28:41.944178 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a020becb-a047-4f6e-bce3-394f24d8bc14" containerName="extract-content" Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.944203 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="a020becb-a047-4f6e-bce3-394f24d8bc14" containerName="extract-content" Oct 01 13:28:41 crc kubenswrapper[4851]: E1001 13:28:41.944228 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7966e9d5-430c-417e-9ba2-b53c598831e7" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.944239 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="7966e9d5-430c-417e-9ba2-b53c598831e7" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:28:41 crc kubenswrapper[4851]: E1001 13:28:41.944252 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a020becb-a047-4f6e-bce3-394f24d8bc14" containerName="extract-utilities" Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.944260 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="a020becb-a047-4f6e-bce3-394f24d8bc14" containerName="extract-utilities" Oct 01 13:28:41 crc kubenswrapper[4851]: E1001 13:28:41.944275 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a020becb-a047-4f6e-bce3-394f24d8bc14" containerName="registry-server" Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.944282 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="a020becb-a047-4f6e-bce3-394f24d8bc14" containerName="registry-server" Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.944592 4851 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="a020becb-a047-4f6e-bce3-394f24d8bc14" containerName="registry-server" Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.944623 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="7966e9d5-430c-417e-9ba2-b53c598831e7" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.945350 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2cbdw" Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.950200 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.950477 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.950706 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2tz4d" Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.950886 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:28:41 crc kubenswrapper[4851]: I1001 13:28:41.953273 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2cbdw"] Oct 01 13:28:42 crc kubenswrapper[4851]: I1001 13:28:42.049295 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2b5h\" (UniqueName: \"kubernetes.io/projected/f4295079-cbbd-4349-b20b-ce68008d76e9-kube-api-access-d2b5h\") pod \"ssh-known-hosts-edpm-deployment-2cbdw\" (UID: \"f4295079-cbbd-4349-b20b-ce68008d76e9\") " pod="openstack/ssh-known-hosts-edpm-deployment-2cbdw" Oct 01 13:28:42 crc kubenswrapper[4851]: I1001 13:28:42.049542 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f4295079-cbbd-4349-b20b-ce68008d76e9-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2cbdw\" (UID: \"f4295079-cbbd-4349-b20b-ce68008d76e9\") " pod="openstack/ssh-known-hosts-edpm-deployment-2cbdw" Oct 01 13:28:42 crc kubenswrapper[4851]: I1001 13:28:42.049600 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f4295079-cbbd-4349-b20b-ce68008d76e9-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2cbdw\" (UID: \"f4295079-cbbd-4349-b20b-ce68008d76e9\") " pod="openstack/ssh-known-hosts-edpm-deployment-2cbdw" Oct 01 13:28:42 crc kubenswrapper[4851]: I1001 13:28:42.151013 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f4295079-cbbd-4349-b20b-ce68008d76e9-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2cbdw\" (UID: \"f4295079-cbbd-4349-b20b-ce68008d76e9\") " pod="openstack/ssh-known-hosts-edpm-deployment-2cbdw" Oct 01 13:28:42 crc kubenswrapper[4851]: I1001 13:28:42.151093 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f4295079-cbbd-4349-b20b-ce68008d76e9-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2cbdw\" (UID: \"f4295079-cbbd-4349-b20b-ce68008d76e9\") " pod="openstack/ssh-known-hosts-edpm-deployment-2cbdw" Oct 
01 13:28:42 crc kubenswrapper[4851]: I1001 13:28:42.151158 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2b5h\" (UniqueName: \"kubernetes.io/projected/f4295079-cbbd-4349-b20b-ce68008d76e9-kube-api-access-d2b5h\") pod \"ssh-known-hosts-edpm-deployment-2cbdw\" (UID: \"f4295079-cbbd-4349-b20b-ce68008d76e9\") " pod="openstack/ssh-known-hosts-edpm-deployment-2cbdw" Oct 01 13:28:42 crc kubenswrapper[4851]: I1001 13:28:42.156277 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f4295079-cbbd-4349-b20b-ce68008d76e9-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2cbdw\" (UID: \"f4295079-cbbd-4349-b20b-ce68008d76e9\") " pod="openstack/ssh-known-hosts-edpm-deployment-2cbdw" Oct 01 13:28:42 crc kubenswrapper[4851]: I1001 13:28:42.156663 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f4295079-cbbd-4349-b20b-ce68008d76e9-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2cbdw\" (UID: \"f4295079-cbbd-4349-b20b-ce68008d76e9\") " pod="openstack/ssh-known-hosts-edpm-deployment-2cbdw" Oct 01 13:28:42 crc kubenswrapper[4851]: I1001 13:28:42.174145 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2b5h\" (UniqueName: \"kubernetes.io/projected/f4295079-cbbd-4349-b20b-ce68008d76e9-kube-api-access-d2b5h\") pod \"ssh-known-hosts-edpm-deployment-2cbdw\" (UID: \"f4295079-cbbd-4349-b20b-ce68008d76e9\") " pod="openstack/ssh-known-hosts-edpm-deployment-2cbdw" Oct 01 13:28:42 crc kubenswrapper[4851]: I1001 13:28:42.273459 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2cbdw" Oct 01 13:28:42 crc kubenswrapper[4851]: I1001 13:28:42.888839 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2cbdw"] Oct 01 13:28:43 crc kubenswrapper[4851]: I1001 13:28:43.883971 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2cbdw" event={"ID":"f4295079-cbbd-4349-b20b-ce68008d76e9","Type":"ContainerStarted","Data":"7596062582e2b94178cdcbdfbc07e405d54ce0a7f00995501d89494d0584ef44"} Oct 01 13:28:43 crc kubenswrapper[4851]: I1001 13:28:43.884462 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2cbdw" event={"ID":"f4295079-cbbd-4349-b20b-ce68008d76e9","Type":"ContainerStarted","Data":"acda9883080bba6dad2072b6a6d29ccd42e6102f807619e158ee2174cce22632"} Oct 01 13:28:43 crc kubenswrapper[4851]: I1001 13:28:43.910831 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-2cbdw" podStartSLOduration=2.430772778 podStartE2EDuration="2.910807226s" podCreationTimestamp="2025-10-01 13:28:41 +0000 UTC" firstStartedPulling="2025-10-01 13:28:42.878996025 +0000 UTC m=+2131.224113521" lastFinishedPulling="2025-10-01 13:28:43.359030483 +0000 UTC m=+2131.704147969" observedRunningTime="2025-10-01 13:28:43.898956948 +0000 UTC m=+2132.244074504" watchObservedRunningTime="2025-10-01 13:28:43.910807226 +0000 UTC m=+2132.255924712" Oct 01 13:28:51 crc kubenswrapper[4851]: I1001 13:28:51.962043 4851 generic.go:334] "Generic (PLEG): container finished" podID="f4295079-cbbd-4349-b20b-ce68008d76e9" containerID="7596062582e2b94178cdcbdfbc07e405d54ce0a7f00995501d89494d0584ef44" exitCode=0 Oct 01 
13:28:51 crc kubenswrapper[4851]: I1001 13:28:51.962144 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2cbdw" event={"ID":"f4295079-cbbd-4349-b20b-ce68008d76e9","Type":"ContainerDied","Data":"7596062582e2b94178cdcbdfbc07e405d54ce0a7f00995501d89494d0584ef44"} Oct 01 13:28:53 crc kubenswrapper[4851]: I1001 13:28:53.450251 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2cbdw" Oct 01 13:28:53 crc kubenswrapper[4851]: I1001 13:28:53.550398 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f4295079-cbbd-4349-b20b-ce68008d76e9-ssh-key-openstack-edpm-ipam\") pod \"f4295079-cbbd-4349-b20b-ce68008d76e9\" (UID: \"f4295079-cbbd-4349-b20b-ce68008d76e9\") " Oct 01 13:28:53 crc kubenswrapper[4851]: I1001 13:28:53.550699 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f4295079-cbbd-4349-b20b-ce68008d76e9-inventory-0\") pod \"f4295079-cbbd-4349-b20b-ce68008d76e9\" (UID: \"f4295079-cbbd-4349-b20b-ce68008d76e9\") " Oct 01 13:28:53 crc kubenswrapper[4851]: I1001 13:28:53.550903 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2b5h\" (UniqueName: \"kubernetes.io/projected/f4295079-cbbd-4349-b20b-ce68008d76e9-kube-api-access-d2b5h\") pod \"f4295079-cbbd-4349-b20b-ce68008d76e9\" (UID: \"f4295079-cbbd-4349-b20b-ce68008d76e9\") " Oct 01 13:28:53 crc kubenswrapper[4851]: I1001 13:28:53.570947 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4295079-cbbd-4349-b20b-ce68008d76e9-kube-api-access-d2b5h" (OuterVolumeSpecName: "kube-api-access-d2b5h") pod "f4295079-cbbd-4349-b20b-ce68008d76e9" (UID: "f4295079-cbbd-4349-b20b-ce68008d76e9"). InnerVolumeSpecName "kube-api-access-d2b5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:28:53 crc kubenswrapper[4851]: I1001 13:28:53.602864 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4295079-cbbd-4349-b20b-ce68008d76e9-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "f4295079-cbbd-4349-b20b-ce68008d76e9" (UID: "f4295079-cbbd-4349-b20b-ce68008d76e9"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:28:53 crc kubenswrapper[4851]: I1001 13:28:53.608973 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4295079-cbbd-4349-b20b-ce68008d76e9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f4295079-cbbd-4349-b20b-ce68008d76e9" (UID: "f4295079-cbbd-4349-b20b-ce68008d76e9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:28:53 crc kubenswrapper[4851]: I1001 13:28:53.653559 4851 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f4295079-cbbd-4349-b20b-ce68008d76e9-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:53 crc kubenswrapper[4851]: I1001 13:28:53.653611 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2b5h\" (UniqueName: \"kubernetes.io/projected/f4295079-cbbd-4349-b20b-ce68008d76e9-kube-api-access-d2b5h\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:53 crc kubenswrapper[4851]: I1001 13:28:53.653640 4851 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f4295079-cbbd-4349-b20b-ce68008d76e9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:53 crc kubenswrapper[4851]: I1001 13:28:53.994013 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2cbdw" event={"ID":"f4295079-cbbd-4349-b20b-ce68008d76e9","Type":"ContainerDied","Data":"acda9883080bba6dad2072b6a6d29ccd42e6102f807619e158ee2174cce22632"} Oct 01 13:28:53 crc kubenswrapper[4851]: I1001 13:28:53.994088 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acda9883080bba6dad2072b6a6d29ccd42e6102f807619e158ee2174cce22632" Oct 01 13:28:53 crc kubenswrapper[4851]: I1001 13:28:53.994185 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2cbdw" Oct 01 13:28:54 crc kubenswrapper[4851]: I1001 13:28:54.125586 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qntf8"] Oct 01 13:28:54 crc kubenswrapper[4851]: E1001 13:28:54.126003 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4295079-cbbd-4349-b20b-ce68008d76e9" containerName="ssh-known-hosts-edpm-deployment" Oct 01 13:28:54 crc kubenswrapper[4851]: I1001 13:28:54.126024 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4295079-cbbd-4349-b20b-ce68008d76e9" containerName="ssh-known-hosts-edpm-deployment" Oct 01 13:28:54 crc kubenswrapper[4851]: I1001 13:28:54.126320 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4295079-cbbd-4349-b20b-ce68008d76e9" containerName="ssh-known-hosts-edpm-deployment" Oct 01 13:28:54 crc kubenswrapper[4851]: I1001 13:28:54.127136 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qntf8" Oct 01 13:28:54 crc kubenswrapper[4851]: I1001 13:28:54.135861 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:28:54 crc kubenswrapper[4851]: I1001 13:28:54.136109 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:28:54 crc kubenswrapper[4851]: I1001 13:28:54.136233 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2tz4d" Oct 01 13:28:54 crc kubenswrapper[4851]: I1001 13:28:54.136190 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:28:54 crc kubenswrapper[4851]: I1001 13:28:54.141560 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qntf8"] Oct 01 13:28:54 crc kubenswrapper[4851]: I1001 13:28:54.268976 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgqmv\" (UniqueName: \"kubernetes.io/projected/75fea0b9-3dcb-4301-835b-346cfe0d09d7-kube-api-access-mgqmv\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qntf8\" (UID: \"75fea0b9-3dcb-4301-835b-346cfe0d09d7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qntf8" Oct 01 13:28:54 crc kubenswrapper[4851]: I1001 13:28:54.269041 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75fea0b9-3dcb-4301-835b-346cfe0d09d7-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qntf8\" (UID: \"75fea0b9-3dcb-4301-835b-346cfe0d09d7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qntf8" Oct 01 13:28:54 crc kubenswrapper[4851]: I1001 13:28:54.269080 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75fea0b9-3dcb-4301-835b-346cfe0d09d7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qntf8\" (UID: \"75fea0b9-3dcb-4301-835b-346cfe0d09d7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qntf8" Oct 01 13:28:54 crc kubenswrapper[4851]: I1001 13:28:54.371276 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgqmv\" (UniqueName: \"kubernetes.io/projected/75fea0b9-3dcb-4301-835b-346cfe0d09d7-kube-api-access-mgqmv\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qntf8\" (UID: \"75fea0b9-3dcb-4301-835b-346cfe0d09d7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qntf8" Oct 01 13:28:54 crc kubenswrapper[4851]: I1001 13:28:54.371355 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75fea0b9-3dcb-4301-835b-346cfe0d09d7-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qntf8\" (UID: \"75fea0b9-3dcb-4301-835b-346cfe0d09d7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qntf8" Oct 01 13:28:54 crc kubenswrapper[4851]: I1001 13:28:54.371399 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75fea0b9-3dcb-4301-835b-346cfe0d09d7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qntf8\" (UID: \"75fea0b9-3dcb-4301-835b-346cfe0d09d7\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qntf8" Oct 01 13:28:54 crc kubenswrapper[4851]: I1001 13:28:54.377759 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75fea0b9-3dcb-4301-835b-346cfe0d09d7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qntf8\" (UID: \"75fea0b9-3dcb-4301-835b-346cfe0d09d7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qntf8" Oct 01 13:28:54 crc kubenswrapper[4851]: I1001 13:28:54.385488 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75fea0b9-3dcb-4301-835b-346cfe0d09d7-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qntf8\" (UID: \"75fea0b9-3dcb-4301-835b-346cfe0d09d7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qntf8" Oct 01 13:28:54 crc kubenswrapper[4851]: I1001 13:28:54.401099 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgqmv\" (UniqueName: \"kubernetes.io/projected/75fea0b9-3dcb-4301-835b-346cfe0d09d7-kube-api-access-mgqmv\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qntf8\" (UID: \"75fea0b9-3dcb-4301-835b-346cfe0d09d7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qntf8" Oct 01 13:28:54 crc kubenswrapper[4851]: I1001 13:28:54.459310 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qntf8" Oct 01 13:28:54 crc kubenswrapper[4851]: I1001 13:28:54.823070 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qntf8"] Oct 01 13:28:55 crc kubenswrapper[4851]: I1001 13:28:55.006911 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qntf8" event={"ID":"75fea0b9-3dcb-4301-835b-346cfe0d09d7","Type":"ContainerStarted","Data":"7dfb4f70aa07a669d0b084d93d66a358c744b8e161784d6402bc1a14da99427e"} Oct 01 13:28:56 crc kubenswrapper[4851]: I1001 13:28:56.024247 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qntf8" event={"ID":"75fea0b9-3dcb-4301-835b-346cfe0d09d7","Type":"ContainerStarted","Data":"7a03c06d4348e51102dcec53d69c563698232ff6cecf11ade4a8ce72325cc908"} Oct 01 13:28:56 crc kubenswrapper[4851]: I1001 13:28:56.043966 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qntf8" podStartSLOduration=1.470692506 podStartE2EDuration="2.043943983s" podCreationTimestamp="2025-10-01 13:28:54 +0000 UTC" firstStartedPulling="2025-10-01 13:28:54.825860157 +0000 UTC m=+2143.170977643" lastFinishedPulling="2025-10-01 13:28:55.399111614 +0000 UTC m=+2143.744229120" observedRunningTime="2025-10-01 13:28:56.039810775 +0000 UTC m=+2144.384928261" watchObservedRunningTime="2025-10-01 13:28:56.043943983 +0000 UTC m=+2144.389061509" Oct 01 13:29:06 crc kubenswrapper[4851]: I1001 13:29:06.129157 4851 generic.go:334] "Generic (PLEG): container finished" podID="75fea0b9-3dcb-4301-835b-346cfe0d09d7" containerID="7a03c06d4348e51102dcec53d69c563698232ff6cecf11ade4a8ce72325cc908" exitCode=0 Oct 01 13:29:06 crc kubenswrapper[4851]: I1001 13:29:06.129246 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qntf8" 
event={"ID":"75fea0b9-3dcb-4301-835b-346cfe0d09d7","Type":"ContainerDied","Data":"7a03c06d4348e51102dcec53d69c563698232ff6cecf11ade4a8ce72325cc908"} Oct 01 13:29:07 crc kubenswrapper[4851]: I1001 13:29:07.675558 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qntf8" Oct 01 13:29:07 crc kubenswrapper[4851]: I1001 13:29:07.763999 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75fea0b9-3dcb-4301-835b-346cfe0d09d7-inventory\") pod \"75fea0b9-3dcb-4301-835b-346cfe0d09d7\" (UID: \"75fea0b9-3dcb-4301-835b-346cfe0d09d7\") " Oct 01 13:29:07 crc kubenswrapper[4851]: I1001 13:29:07.764074 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75fea0b9-3dcb-4301-835b-346cfe0d09d7-ssh-key\") pod \"75fea0b9-3dcb-4301-835b-346cfe0d09d7\" (UID: \"75fea0b9-3dcb-4301-835b-346cfe0d09d7\") " Oct 01 13:29:07 crc kubenswrapper[4851]: I1001 13:29:07.764250 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgqmv\" (UniqueName: \"kubernetes.io/projected/75fea0b9-3dcb-4301-835b-346cfe0d09d7-kube-api-access-mgqmv\") pod \"75fea0b9-3dcb-4301-835b-346cfe0d09d7\" (UID: \"75fea0b9-3dcb-4301-835b-346cfe0d09d7\") " Oct 01 13:29:07 crc kubenswrapper[4851]: I1001 13:29:07.778741 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75fea0b9-3dcb-4301-835b-346cfe0d09d7-kube-api-access-mgqmv" (OuterVolumeSpecName: "kube-api-access-mgqmv") pod "75fea0b9-3dcb-4301-835b-346cfe0d09d7" (UID: "75fea0b9-3dcb-4301-835b-346cfe0d09d7"). InnerVolumeSpecName "kube-api-access-mgqmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:29:07 crc kubenswrapper[4851]: I1001 13:29:07.799046 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75fea0b9-3dcb-4301-835b-346cfe0d09d7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "75fea0b9-3dcb-4301-835b-346cfe0d09d7" (UID: "75fea0b9-3dcb-4301-835b-346cfe0d09d7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:29:07 crc kubenswrapper[4851]: I1001 13:29:07.803659 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75fea0b9-3dcb-4301-835b-346cfe0d09d7-inventory" (OuterVolumeSpecName: "inventory") pod "75fea0b9-3dcb-4301-835b-346cfe0d09d7" (UID: "75fea0b9-3dcb-4301-835b-346cfe0d09d7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:29:07 crc kubenswrapper[4851]: I1001 13:29:07.866595 4851 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75fea0b9-3dcb-4301-835b-346cfe0d09d7-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:07 crc kubenswrapper[4851]: I1001 13:29:07.866638 4851 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75fea0b9-3dcb-4301-835b-346cfe0d09d7-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:07 crc kubenswrapper[4851]: I1001 13:29:07.866655 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgqmv\" (UniqueName: \"kubernetes.io/projected/75fea0b9-3dcb-4301-835b-346cfe0d09d7-kube-api-access-mgqmv\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:08 crc kubenswrapper[4851]: I1001 13:29:08.161396 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qntf8" event={"ID":"75fea0b9-3dcb-4301-835b-346cfe0d09d7","Type":"ContainerDied","Data":"7dfb4f70aa07a669d0b084d93d66a358c744b8e161784d6402bc1a14da99427e"} Oct 01 13:29:08 crc kubenswrapper[4851]: I1001 13:29:08.161450 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dfb4f70aa07a669d0b084d93d66a358c744b8e161784d6402bc1a14da99427e" Oct 01 13:29:08 crc kubenswrapper[4851]: I1001 13:29:08.161563 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qntf8" Oct 01 13:29:08 crc kubenswrapper[4851]: I1001 13:29:08.267264 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h"] Oct 01 13:29:08 crc kubenswrapper[4851]: E1001 13:29:08.268325 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75fea0b9-3dcb-4301-835b-346cfe0d09d7" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:29:08 crc kubenswrapper[4851]: I1001 13:29:08.268349 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="75fea0b9-3dcb-4301-835b-346cfe0d09d7" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:29:08 crc kubenswrapper[4851]: I1001 13:29:08.268709 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="75fea0b9-3dcb-4301-835b-346cfe0d09d7" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:29:08 crc kubenswrapper[4851]: I1001 13:29:08.269434 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h" Oct 01 13:29:08 crc kubenswrapper[4851]: I1001 13:29:08.276013 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:29:08 crc kubenswrapper[4851]: I1001 13:29:08.276249 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:29:08 crc kubenswrapper[4851]: I1001 13:29:08.276437 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:29:08 crc kubenswrapper[4851]: I1001 13:29:08.276617 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2tz4d" Oct 01 13:29:08 crc kubenswrapper[4851]: I1001 13:29:08.304202 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h"] Oct 01 13:29:08 crc kubenswrapper[4851]: I1001 13:29:08.379586 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6946a3f7-315a-48df-8a02-af0fee0d1fce-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h\" (UID: \"6946a3f7-315a-48df-8a02-af0fee0d1fce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h" Oct 01 13:29:08 crc kubenswrapper[4851]: I1001 13:29:08.379651 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6946a3f7-315a-48df-8a02-af0fee0d1fce-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h\" (UID: \"6946a3f7-315a-48df-8a02-af0fee0d1fce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h" Oct 01 13:29:08 crc kubenswrapper[4851]: I1001 13:29:08.379845 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnsbt\" (UniqueName: \"kubernetes.io/projected/6946a3f7-315a-48df-8a02-af0fee0d1fce-kube-api-access-jnsbt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h\" (UID: \"6946a3f7-315a-48df-8a02-af0fee0d1fce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h" Oct 01 13:29:08 crc kubenswrapper[4851]: I1001 13:29:08.482207 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6946a3f7-315a-48df-8a02-af0fee0d1fce-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h\" (UID: \"6946a3f7-315a-48df-8a02-af0fee0d1fce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h" Oct 01 13:29:08 crc kubenswrapper[4851]: I1001 13:29:08.482253 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6946a3f7-315a-48df-8a02-af0fee0d1fce-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h\" (UID: \"6946a3f7-315a-48df-8a02-af0fee0d1fce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h" Oct 01 13:29:08 crc kubenswrapper[4851]: I1001 13:29:08.482347 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnsbt\" (UniqueName: \"kubernetes.io/projected/6946a3f7-315a-48df-8a02-af0fee0d1fce-kube-api-access-jnsbt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h\" (UID: 
\"6946a3f7-315a-48df-8a02-af0fee0d1fce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h" Oct 01 13:29:08 crc kubenswrapper[4851]: I1001 13:29:08.489193 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6946a3f7-315a-48df-8a02-af0fee0d1fce-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h\" (UID: \"6946a3f7-315a-48df-8a02-af0fee0d1fce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h" Oct 01 13:29:08 crc kubenswrapper[4851]: I1001 13:29:08.489781 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6946a3f7-315a-48df-8a02-af0fee0d1fce-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h\" (UID: \"6946a3f7-315a-48df-8a02-af0fee0d1fce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h" Oct 01 13:29:08 crc kubenswrapper[4851]: I1001 13:29:08.512106 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnsbt\" (UniqueName: \"kubernetes.io/projected/6946a3f7-315a-48df-8a02-af0fee0d1fce-kube-api-access-jnsbt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h\" (UID: \"6946a3f7-315a-48df-8a02-af0fee0d1fce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h" Oct 01 13:29:08 crc kubenswrapper[4851]: I1001 13:29:08.607794 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h" Oct 01 13:29:09 crc kubenswrapper[4851]: I1001 13:29:09.163570 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h"] Oct 01 13:29:09 crc kubenswrapper[4851]: I1001 13:29:09.170621 4851 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:29:10 crc kubenswrapper[4851]: I1001 13:29:10.198831 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h" event={"ID":"6946a3f7-315a-48df-8a02-af0fee0d1fce","Type":"ContainerStarted","Data":"e5fff9ed00ff85b0420b1dff112ec0de69a0a01111f2e649b9301918d8f7abe0"} Oct 01 13:29:11 crc kubenswrapper[4851]: I1001 13:29:11.213977 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h" event={"ID":"6946a3f7-315a-48df-8a02-af0fee0d1fce","Type":"ContainerStarted","Data":"f280460d7991e48781733d4445ef0fbe551d739c5c66031caf44f59c44b20303"} Oct 01 13:29:11 crc kubenswrapper[4851]: I1001 13:29:11.243491 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h" podStartSLOduration=2.218431914 podStartE2EDuration="3.243468591s" podCreationTimestamp="2025-10-01 13:29:08 +0000 UTC" firstStartedPulling="2025-10-01 13:29:09.170357569 +0000 UTC m=+2157.515475065" lastFinishedPulling="2025-10-01 13:29:10.195394246 +0000 UTC m=+2158.540511742" observedRunningTime="2025-10-01 13:29:11.233807926 +0000 UTC m=+2159.578925422" watchObservedRunningTime="2025-10-01 13:29:11.243468591 +0000 UTC m=+2159.588586097" Oct 01 13:29:16 crc kubenswrapper[4851]: I1001 13:29:16.400357 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5wm6q"] Oct 01 13:29:16 crc kubenswrapper[4851]: I1001 13:29:16.404400 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5wm6q" Oct 01 13:29:16 crc kubenswrapper[4851]: I1001 13:29:16.417657 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5wm6q"] Oct 01 13:29:16 crc kubenswrapper[4851]: I1001 13:29:16.510867 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86900bcc-74ed-423f-bae9-a9b9dcb9f93c-utilities\") pod \"redhat-operators-5wm6q\" (UID: \"86900bcc-74ed-423f-bae9-a9b9dcb9f93c\") " pod="openshift-marketplace/redhat-operators-5wm6q" Oct 01 13:29:16 crc kubenswrapper[4851]: I1001 13:29:16.511051 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86900bcc-74ed-423f-bae9-a9b9dcb9f93c-catalog-content\") pod \"redhat-operators-5wm6q\" (UID: \"86900bcc-74ed-423f-bae9-a9b9dcb9f93c\") " pod="openshift-marketplace/redhat-operators-5wm6q" Oct 01 13:29:16 crc kubenswrapper[4851]: I1001 13:29:16.511083 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mkvh\" (UniqueName: \"kubernetes.io/projected/86900bcc-74ed-423f-bae9-a9b9dcb9f93c-kube-api-access-5mkvh\") pod \"redhat-operators-5wm6q\" (UID: \"86900bcc-74ed-423f-bae9-a9b9dcb9f93c\") " pod="openshift-marketplace/redhat-operators-5wm6q" Oct 01 13:29:16 crc kubenswrapper[4851]: I1001 13:29:16.613120 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86900bcc-74ed-423f-bae9-a9b9dcb9f93c-catalog-content\") pod \"redhat-operators-5wm6q\" (UID: \"86900bcc-74ed-423f-bae9-a9b9dcb9f93c\") " pod="openshift-marketplace/redhat-operators-5wm6q" Oct 01 13:29:16 crc kubenswrapper[4851]: I1001 13:29:16.613184 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mkvh\" (UniqueName: \"kubernetes.io/projected/86900bcc-74ed-423f-bae9-a9b9dcb9f93c-kube-api-access-5mkvh\") pod \"redhat-operators-5wm6q\" (UID: \"86900bcc-74ed-423f-bae9-a9b9dcb9f93c\") " pod="openshift-marketplace/redhat-operators-5wm6q" Oct 01 13:29:16 crc kubenswrapper[4851]: I1001 13:29:16.613371 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86900bcc-74ed-423f-bae9-a9b9dcb9f93c-utilities\") pod \"redhat-operators-5wm6q\" (UID: \"86900bcc-74ed-423f-bae9-a9b9dcb9f93c\") " pod="openshift-marketplace/redhat-operators-5wm6q" Oct 01 13:29:16 crc kubenswrapper[4851]: I1001 13:29:16.614001 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86900bcc-74ed-423f-bae9-a9b9dcb9f93c-catalog-content\") pod \"redhat-operators-5wm6q\" (UID: \"86900bcc-74ed-423f-bae9-a9b9dcb9f93c\") " pod="openshift-marketplace/redhat-operators-5wm6q" Oct 01 13:29:16 crc kubenswrapper[4851]: I1001 13:29:16.614015 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86900bcc-74ed-423f-bae9-a9b9dcb9f93c-utilities\") pod \"redhat-operators-5wm6q\" (UID: \"86900bcc-74ed-423f-bae9-a9b9dcb9f93c\") " pod="openshift-marketplace/redhat-operators-5wm6q" Oct 01 13:29:16 crc kubenswrapper[4851]: I1001 13:29:16.643407 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5mkvh\" (UniqueName: \"kubernetes.io/projected/86900bcc-74ed-423f-bae9-a9b9dcb9f93c-kube-api-access-5mkvh\") pod \"redhat-operators-5wm6q\" (UID: \"86900bcc-74ed-423f-bae9-a9b9dcb9f93c\") " pod="openshift-marketplace/redhat-operators-5wm6q" Oct 01 13:29:16 crc kubenswrapper[4851]: I1001 13:29:16.738957 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5wm6q" Oct 01 13:29:17 crc kubenswrapper[4851]: I1001 13:29:17.223182 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5wm6q"] Oct 01 13:29:17 crc kubenswrapper[4851]: I1001 13:29:17.316358 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5wm6q" event={"ID":"86900bcc-74ed-423f-bae9-a9b9dcb9f93c","Type":"ContainerStarted","Data":"9ee118d399cddaeae033eddf197ff181a5841db8b15e79a23e342c51a36a62d7"} Oct 01 13:29:18 crc kubenswrapper[4851]: I1001 13:29:18.329067 4851 generic.go:334] "Generic (PLEG): container finished" podID="86900bcc-74ed-423f-bae9-a9b9dcb9f93c" containerID="244d01476244d12196c818118cebc6506d5d89b6a5d91343da2af5fd1c07a141" exitCode=0 Oct 01 13:29:18 crc kubenswrapper[4851]: I1001 13:29:18.344348 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5wm6q" event={"ID":"86900bcc-74ed-423f-bae9-a9b9dcb9f93c","Type":"ContainerDied","Data":"244d01476244d12196c818118cebc6506d5d89b6a5d91343da2af5fd1c07a141"} Oct 01 13:29:20 crc kubenswrapper[4851]: I1001 13:29:20.356530 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5wm6q" event={"ID":"86900bcc-74ed-423f-bae9-a9b9dcb9f93c","Type":"ContainerStarted","Data":"c0fb02ef24d3ac04aa45e3cd5fb4a77c610ef1414f98f199b8b6ce6057de547a"} Oct 01 13:29:21 crc kubenswrapper[4851]: I1001 13:29:21.372284 4851 generic.go:334] "Generic (PLEG): container finished" podID="6946a3f7-315a-48df-8a02-af0fee0d1fce" containerID="f280460d7991e48781733d4445ef0fbe551d739c5c66031caf44f59c44b20303" exitCode=0 Oct 01 13:29:21 crc kubenswrapper[4851]: I1001 13:29:21.372407 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h" event={"ID":"6946a3f7-315a-48df-8a02-af0fee0d1fce","Type":"ContainerDied","Data":"f280460d7991e48781733d4445ef0fbe551d739c5c66031caf44f59c44b20303"} Oct 01 13:29:22 crc kubenswrapper[4851]: I1001 13:29:22.831604 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h" Oct 01 13:29:22 crc kubenswrapper[4851]: I1001 13:29:22.861197 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6946a3f7-315a-48df-8a02-af0fee0d1fce-ssh-key\") pod \"6946a3f7-315a-48df-8a02-af0fee0d1fce\" (UID: \"6946a3f7-315a-48df-8a02-af0fee0d1fce\") " Oct 01 13:29:22 crc kubenswrapper[4851]: I1001 13:29:22.861269 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6946a3f7-315a-48df-8a02-af0fee0d1fce-inventory\") pod \"6946a3f7-315a-48df-8a02-af0fee0d1fce\" (UID: \"6946a3f7-315a-48df-8a02-af0fee0d1fce\") " Oct 01 13:29:22 crc kubenswrapper[4851]: I1001 13:29:22.861390 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnsbt\" (UniqueName: \"kubernetes.io/projected/6946a3f7-315a-48df-8a02-af0fee0d1fce-kube-api-access-jnsbt\") pod \"6946a3f7-315a-48df-8a02-af0fee0d1fce\" (UID: \"6946a3f7-315a-48df-8a02-af0fee0d1fce\") " Oct 01 13:29:22 crc kubenswrapper[4851]: I1001 13:29:22.867220 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6946a3f7-315a-48df-8a02-af0fee0d1fce-kube-api-access-jnsbt" (OuterVolumeSpecName: "kube-api-access-jnsbt") pod "6946a3f7-315a-48df-8a02-af0fee0d1fce" (UID: "6946a3f7-315a-48df-8a02-af0fee0d1fce"). InnerVolumeSpecName "kube-api-access-jnsbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:29:22 crc kubenswrapper[4851]: I1001 13:29:22.890761 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6946a3f7-315a-48df-8a02-af0fee0d1fce-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6946a3f7-315a-48df-8a02-af0fee0d1fce" (UID: "6946a3f7-315a-48df-8a02-af0fee0d1fce"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:29:22 crc kubenswrapper[4851]: I1001 13:29:22.894098 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6946a3f7-315a-48df-8a02-af0fee0d1fce-inventory" (OuterVolumeSpecName: "inventory") pod "6946a3f7-315a-48df-8a02-af0fee0d1fce" (UID: "6946a3f7-315a-48df-8a02-af0fee0d1fce"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:29:22 crc kubenswrapper[4851]: I1001 13:29:22.963950 4851 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6946a3f7-315a-48df-8a02-af0fee0d1fce-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:22 crc kubenswrapper[4851]: I1001 13:29:22.964002 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnsbt\" (UniqueName: \"kubernetes.io/projected/6946a3f7-315a-48df-8a02-af0fee0d1fce-kube-api-access-jnsbt\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:22 crc kubenswrapper[4851]: I1001 13:29:22.964023 4851 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6946a3f7-315a-48df-8a02-af0fee0d1fce-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.400987 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h" event={"ID":"6946a3f7-315a-48df-8a02-af0fee0d1fce","Type":"ContainerDied","Data":"e5fff9ed00ff85b0420b1dff112ec0de69a0a01111f2e649b9301918d8f7abe0"} Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.401382 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5fff9ed00ff85b0420b1dff112ec0de69a0a01111f2e649b9301918d8f7abe0" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.401043 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.483781 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn"] Oct 01 13:29:23 crc kubenswrapper[4851]: E1001 13:29:23.484358 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6946a3f7-315a-48df-8a02-af0fee0d1fce" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.484386 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="6946a3f7-315a-48df-8a02-af0fee0d1fce" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.484688 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="6946a3f7-315a-48df-8a02-af0fee0d1fce" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.485781 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.488395 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.488676 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.488732 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.491244 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.491973 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.493243 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2tz4d" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.493435 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.493895 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.502455 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn"] Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.578139 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.578201 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.578268 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.578301 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: 
\"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.578345 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.578536 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.578601 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.578664 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.578706 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.578746 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.578778 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmxbj\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-kube-api-access-lmxbj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" 
(UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.578813 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.578867 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.578913 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.680400 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.680474 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.680524 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.680580 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.680617 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.680661 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.680710 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.681704 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.681752 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.681783 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.681823 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.681863 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmxbj\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-kube-api-access-lmxbj\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.681899 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.681964 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.732386 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.732420 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.732914 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.733193 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.733435 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.733992 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.734149 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.734273 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.735283 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.735411 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.736575 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.736628 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmxbj\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-kube-api-access-lmxbj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.737061 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.741487 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:23 crc kubenswrapper[4851]: I1001 13:29:23.845056 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:29:24 crc kubenswrapper[4851]: I1001 13:29:24.420754 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn"] Oct 01 13:29:24 crc kubenswrapper[4851]: W1001 13:29:24.425791 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd4ecbfe_8a57_43e0_9ef9_bdbdd00053e0.slice/crio-468071ed5d14537726aa2b0b41276e8782a6e2ce33ba85937ab07b12c0429bd5 WatchSource:0}: Error finding container 468071ed5d14537726aa2b0b41276e8782a6e2ce33ba85937ab07b12c0429bd5: Status 404 returned error can't find the container with id 468071ed5d14537726aa2b0b41276e8782a6e2ce33ba85937ab07b12c0429bd5 Oct 01 13:29:25 crc kubenswrapper[4851]: I1001 13:29:25.427611 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" event={"ID":"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0","Type":"ContainerStarted","Data":"468071ed5d14537726aa2b0b41276e8782a6e2ce33ba85937ab07b12c0429bd5"} Oct 01 13:29:26 crc kubenswrapper[4851]: I1001 13:29:26.453519 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" event={"ID":"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0","Type":"ContainerStarted","Data":"fde90d86d534916f924554f31d968fd7f5886eea5c102fa49611dd552b7dda92"} Oct 01 13:29:28 crc kubenswrapper[4851]: I1001 13:29:28.474622 4851 generic.go:334] "Generic (PLEG): container finished" podID="86900bcc-74ed-423f-bae9-a9b9dcb9f93c" containerID="c0fb02ef24d3ac04aa45e3cd5fb4a77c610ef1414f98f199b8b6ce6057de547a" exitCode=0 Oct 01 13:29:28 crc kubenswrapper[4851]: I1001 13:29:28.474748 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5wm6q" event={"ID":"86900bcc-74ed-423f-bae9-a9b9dcb9f93c","Type":"ContainerDied","Data":"c0fb02ef24d3ac04aa45e3cd5fb4a77c610ef1414f98f199b8b6ce6057de547a"} Oct 01 13:29:28 crc kubenswrapper[4851]: I1001 13:29:28.502187 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" podStartSLOduration=4.845989919 podStartE2EDuration="5.502163172s" podCreationTimestamp="2025-10-01 13:29:23 +0000 UTC" firstStartedPulling="2025-10-01 13:29:24.430434743 +0000 UTC m=+2172.775552229" lastFinishedPulling="2025-10-01 13:29:25.086607956 +0000 UTC m=+2173.431725482" observedRunningTime="2025-10-01 13:29:26.483964227 +0000 UTC m=+2174.829081723" watchObservedRunningTime="2025-10-01 13:29:28.502163172 +0000 UTC m=+2176.847280698" Oct 01 13:29:30 crc kubenswrapper[4851]: I1001 13:29:30.499091 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5wm6q" 
event={"ID":"86900bcc-74ed-423f-bae9-a9b9dcb9f93c","Type":"ContainerStarted","Data":"ea16a8772b96a9eda314c8f7c7290d12d775533a6cfc4a1c3e0b182dcb0656b1"} Oct 01 13:29:30 crc kubenswrapper[4851]: I1001 13:29:30.524878 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5wm6q" podStartSLOduration=3.38493732 podStartE2EDuration="14.524848944s" podCreationTimestamp="2025-10-01 13:29:16 +0000 UTC" firstStartedPulling="2025-10-01 13:29:18.331764391 +0000 UTC m=+2166.676881877" lastFinishedPulling="2025-10-01 13:29:29.471676015 +0000 UTC m=+2177.816793501" observedRunningTime="2025-10-01 13:29:30.524300578 +0000 UTC m=+2178.869418104" watchObservedRunningTime="2025-10-01 13:29:30.524848944 +0000 UTC m=+2178.869966470" Oct 01 13:29:36 crc kubenswrapper[4851]: I1001 13:29:36.739395 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5wm6q" Oct 01 13:29:36 crc kubenswrapper[4851]: I1001 13:29:36.740213 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5wm6q" Oct 01 13:29:36 crc kubenswrapper[4851]: I1001 13:29:36.827613 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5wm6q" Oct 01 13:29:37 crc kubenswrapper[4851]: I1001 13:29:37.637038 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5wm6q" Oct 01 13:29:37 crc kubenswrapper[4851]: I1001 13:29:37.684563 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5wm6q"] Oct 01 13:29:39 crc kubenswrapper[4851]: I1001 13:29:39.599565 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5wm6q" podUID="86900bcc-74ed-423f-bae9-a9b9dcb9f93c" containerName="registry-server" containerID="cri-o://ea16a8772b96a9eda314c8f7c7290d12d775533a6cfc4a1c3e0b182dcb0656b1" gracePeriod=2 Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.060744 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5wm6q" Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.189340 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86900bcc-74ed-423f-bae9-a9b9dcb9f93c-catalog-content\") pod \"86900bcc-74ed-423f-bae9-a9b9dcb9f93c\" (UID: \"86900bcc-74ed-423f-bae9-a9b9dcb9f93c\") " Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.189774 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mkvh\" (UniqueName: \"kubernetes.io/projected/86900bcc-74ed-423f-bae9-a9b9dcb9f93c-kube-api-access-5mkvh\") pod \"86900bcc-74ed-423f-bae9-a9b9dcb9f93c\" (UID: \"86900bcc-74ed-423f-bae9-a9b9dcb9f93c\") " Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.189946 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86900bcc-74ed-423f-bae9-a9b9dcb9f93c-utilities\") pod \"86900bcc-74ed-423f-bae9-a9b9dcb9f93c\" (UID: \"86900bcc-74ed-423f-bae9-a9b9dcb9f93c\") " Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.190636 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86900bcc-74ed-423f-bae9-a9b9dcb9f93c-utilities" (OuterVolumeSpecName: "utilities") pod "86900bcc-74ed-423f-bae9-a9b9dcb9f93c" (UID: "86900bcc-74ed-423f-bae9-a9b9dcb9f93c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.208441 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86900bcc-74ed-423f-bae9-a9b9dcb9f93c-kube-api-access-5mkvh" (OuterVolumeSpecName: "kube-api-access-5mkvh") pod "86900bcc-74ed-423f-bae9-a9b9dcb9f93c" (UID: "86900bcc-74ed-423f-bae9-a9b9dcb9f93c"). InnerVolumeSpecName "kube-api-access-5mkvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.265692 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86900bcc-74ed-423f-bae9-a9b9dcb9f93c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86900bcc-74ed-423f-bae9-a9b9dcb9f93c" (UID: "86900bcc-74ed-423f-bae9-a9b9dcb9f93c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.292426 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86900bcc-74ed-423f-bae9-a9b9dcb9f93c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.292458 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mkvh\" (UniqueName: \"kubernetes.io/projected/86900bcc-74ed-423f-bae9-a9b9dcb9f93c-kube-api-access-5mkvh\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.292469 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86900bcc-74ed-423f-bae9-a9b9dcb9f93c-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.614712 4851 generic.go:334] "Generic (PLEG): container finished" podID="86900bcc-74ed-423f-bae9-a9b9dcb9f93c" containerID="ea16a8772b96a9eda314c8f7c7290d12d775533a6cfc4a1c3e0b182dcb0656b1" exitCode=0 Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.614825 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5wm6q" event={"ID":"86900bcc-74ed-423f-bae9-a9b9dcb9f93c","Type":"ContainerDied","Data":"ea16a8772b96a9eda314c8f7c7290d12d775533a6cfc4a1c3e0b182dcb0656b1"} Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.614853 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5wm6q" Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.614888 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5wm6q" event={"ID":"86900bcc-74ed-423f-bae9-a9b9dcb9f93c","Type":"ContainerDied","Data":"9ee118d399cddaeae033eddf197ff181a5841db8b15e79a23e342c51a36a62d7"} Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.614930 4851 scope.go:117] "RemoveContainer" containerID="ea16a8772b96a9eda314c8f7c7290d12d775533a6cfc4a1c3e0b182dcb0656b1" Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.648058 4851 scope.go:117] "RemoveContainer" containerID="c0fb02ef24d3ac04aa45e3cd5fb4a77c610ef1414f98f199b8b6ce6057de547a" Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.661193 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5wm6q"] Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.671517 4851 scope.go:117] "RemoveContainer" containerID="244d01476244d12196c818118cebc6506d5d89b6a5d91343da2af5fd1c07a141" Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.673364 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5wm6q"] Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.726150 4851 scope.go:117] "RemoveContainer" containerID="ea16a8772b96a9eda314c8f7c7290d12d775533a6cfc4a1c3e0b182dcb0656b1" Oct 01 13:29:40 crc kubenswrapper[4851]: E1001 13:29:40.726663 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea16a8772b96a9eda314c8f7c7290d12d775533a6cfc4a1c3e0b182dcb0656b1\": container with ID starting with ea16a8772b96a9eda314c8f7c7290d12d775533a6cfc4a1c3e0b182dcb0656b1 not found: ID does not exist" containerID="ea16a8772b96a9eda314c8f7c7290d12d775533a6cfc4a1c3e0b182dcb0656b1" Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.726709 4851 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea16a8772b96a9eda314c8f7c7290d12d775533a6cfc4a1c3e0b182dcb0656b1"} err="failed to get container status \"ea16a8772b96a9eda314c8f7c7290d12d775533a6cfc4a1c3e0b182dcb0656b1\": rpc error: code = NotFound desc = could not find container \"ea16a8772b96a9eda314c8f7c7290d12d775533a6cfc4a1c3e0b182dcb0656b1\": container with ID starting with ea16a8772b96a9eda314c8f7c7290d12d775533a6cfc4a1c3e0b182dcb0656b1 not found: ID does not exist" Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.726738 4851 scope.go:117] "RemoveContainer" containerID="c0fb02ef24d3ac04aa45e3cd5fb4a77c610ef1414f98f199b8b6ce6057de547a" Oct 01 13:29:40 crc kubenswrapper[4851]: E1001 13:29:40.727093 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0fb02ef24d3ac04aa45e3cd5fb4a77c610ef1414f98f199b8b6ce6057de547a\": container with ID starting with c0fb02ef24d3ac04aa45e3cd5fb4a77c610ef1414f98f199b8b6ce6057de547a not found: ID does not exist" containerID="c0fb02ef24d3ac04aa45e3cd5fb4a77c610ef1414f98f199b8b6ce6057de547a" Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.727148 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0fb02ef24d3ac04aa45e3cd5fb4a77c610ef1414f98f199b8b6ce6057de547a"} err="failed to get container status \"c0fb02ef24d3ac04aa45e3cd5fb4a77c610ef1414f98f199b8b6ce6057de547a\": rpc error: code = NotFound desc = could not find container \"c0fb02ef24d3ac04aa45e3cd5fb4a77c610ef1414f98f199b8b6ce6057de547a\": container with ID starting with c0fb02ef24d3ac04aa45e3cd5fb4a77c610ef1414f98f199b8b6ce6057de547a not found: ID does not exist" Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.727186 4851 scope.go:117] "RemoveContainer" containerID="244d01476244d12196c818118cebc6506d5d89b6a5d91343da2af5fd1c07a141" Oct 01 13:29:40 crc kubenswrapper[4851]: E1001 13:29:40.727479 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"244d01476244d12196c818118cebc6506d5d89b6a5d91343da2af5fd1c07a141\": container with ID starting with 244d01476244d12196c818118cebc6506d5d89b6a5d91343da2af5fd1c07a141 not found: ID does not exist" containerID="244d01476244d12196c818118cebc6506d5d89b6a5d91343da2af5fd1c07a141" Oct 01 13:29:40 crc kubenswrapper[4851]: I1001 13:29:40.727543 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"244d01476244d12196c818118cebc6506d5d89b6a5d91343da2af5fd1c07a141"} err="failed to get container status \"244d01476244d12196c818118cebc6506d5d89b6a5d91343da2af5fd1c07a141\": rpc error: code = NotFound desc = could not find container \"244d01476244d12196c818118cebc6506d5d89b6a5d91343da2af5fd1c07a141\": container with ID starting with 244d01476244d12196c818118cebc6506d5d89b6a5d91343da2af5fd1c07a141 not found: ID does not exist" Oct 01 13:29:42 crc kubenswrapper[4851]: I1001 13:29:42.355737 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86900bcc-74ed-423f-bae9-a9b9dcb9f93c" path="/var/lib/kubelet/pods/86900bcc-74ed-423f-bae9-a9b9dcb9f93c/volumes" Oct 01 13:30:00 crc kubenswrapper[4851]: I1001 13:30:00.051037 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:30:00 crc kubenswrapper[4851]: I1001 13:30:00.051746 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:30:00 crc kubenswrapper[4851]: I1001 13:30:00.172982 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-6hcgd"] Oct 01 13:30:00 crc kubenswrapper[4851]: E1001 13:30:00.173569 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86900bcc-74ed-423f-bae9-a9b9dcb9f93c" containerName="extract-content" Oct 01 13:30:00 crc kubenswrapper[4851]: I1001 13:30:00.173586 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="86900bcc-74ed-423f-bae9-a9b9dcb9f93c" containerName="extract-content" Oct 01 13:30:00 crc kubenswrapper[4851]: E1001 13:30:00.173609 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86900bcc-74ed-423f-bae9-a9b9dcb9f93c" containerName="registry-server" Oct 01 13:30:00 crc kubenswrapper[4851]: I1001 13:30:00.173617 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="86900bcc-74ed-423f-bae9-a9b9dcb9f93c" containerName="registry-server" Oct 01 13:30:00 crc kubenswrapper[4851]: E1001 13:30:00.173640 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86900bcc-74ed-423f-bae9-a9b9dcb9f93c" containerName="extract-utilities" Oct 01 13:30:00 crc kubenswrapper[4851]: I1001 13:30:00.173649 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="86900bcc-74ed-423f-bae9-a9b9dcb9f93c" containerName="extract-utilities" Oct 01 13:30:00 crc kubenswrapper[4851]: I1001 13:30:00.173870 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="86900bcc-74ed-423f-bae9-a9b9dcb9f93c" containerName="registry-server" Oct 01 13:30:00 crc kubenswrapper[4851]: I1001 13:30:00.174689 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-6hcgd" Oct 01 13:30:00 crc kubenswrapper[4851]: I1001 13:30:00.178038 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 13:30:00 crc kubenswrapper[4851]: I1001 13:30:00.178054 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 13:30:00 crc kubenswrapper[4851]: I1001 13:30:00.195018 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-6hcgd"] Oct 01 13:30:00 crc kubenswrapper[4851]: I1001 13:30:00.242872 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffd845e9-6f20-4b95-9d0d-b71c66d0ea05-config-volume\") pod \"collect-profiles-29322090-6hcgd\" (UID: \"ffd845e9-6f20-4b95-9d0d-b71c66d0ea05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-6hcgd" Oct 01 13:30:00 crc kubenswrapper[4851]: I1001 13:30:00.243263 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc8xp\" (UniqueName: \"kubernetes.io/projected/ffd845e9-6f20-4b95-9d0d-b71c66d0ea05-kube-api-access-dc8xp\") pod \"collect-profiles-29322090-6hcgd\" (UID: \"ffd845e9-6f20-4b95-9d0d-b71c66d0ea05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-6hcgd" Oct 01 13:30:00 crc kubenswrapper[4851]: I1001 13:30:00.243469 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffd845e9-6f20-4b95-9d0d-b71c66d0ea05-secret-volume\") pod \"collect-profiles-29322090-6hcgd\" (UID: \"ffd845e9-6f20-4b95-9d0d-b71c66d0ea05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-6hcgd" Oct 01 13:30:00 crc kubenswrapper[4851]: I1001 13:30:00.345965 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffd845e9-6f20-4b95-9d0d-b71c66d0ea05-config-volume\") pod \"collect-profiles-29322090-6hcgd\" (UID: \"ffd845e9-6f20-4b95-9d0d-b71c66d0ea05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-6hcgd" Oct 01 13:30:00 crc kubenswrapper[4851]: I1001 13:30:00.346095 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc8xp\" (UniqueName: \"kubernetes.io/projected/ffd845e9-6f20-4b95-9d0d-b71c66d0ea05-kube-api-access-dc8xp\") pod \"collect-profiles-29322090-6hcgd\" (UID: \"ffd845e9-6f20-4b95-9d0d-b71c66d0ea05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-6hcgd" Oct 01 13:30:00 crc kubenswrapper[4851]: I1001 13:30:00.346175 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffd845e9-6f20-4b95-9d0d-b71c66d0ea05-secret-volume\") pod \"collect-profiles-29322090-6hcgd\" (UID: \"ffd845e9-6f20-4b95-9d0d-b71c66d0ea05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-6hcgd" Oct 01 13:30:00 crc kubenswrapper[4851]: I1001 13:30:00.348148 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffd845e9-6f20-4b95-9d0d-b71c66d0ea05-config-volume\") pod 
\"collect-profiles-29322090-6hcgd\" (UID: \"ffd845e9-6f20-4b95-9d0d-b71c66d0ea05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-6hcgd" Oct 01 13:30:00 crc kubenswrapper[4851]: I1001 13:30:00.353397 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffd845e9-6f20-4b95-9d0d-b71c66d0ea05-secret-volume\") pod \"collect-profiles-29322090-6hcgd\" (UID: \"ffd845e9-6f20-4b95-9d0d-b71c66d0ea05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-6hcgd" Oct 01 13:30:00 crc kubenswrapper[4851]: I1001 13:30:00.367869 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc8xp\" (UniqueName: \"kubernetes.io/projected/ffd845e9-6f20-4b95-9d0d-b71c66d0ea05-kube-api-access-dc8xp\") pod \"collect-profiles-29322090-6hcgd\" (UID: \"ffd845e9-6f20-4b95-9d0d-b71c66d0ea05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-6hcgd" Oct 01 13:30:00 crc kubenswrapper[4851]: I1001 13:30:00.506195 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-6hcgd" Oct 01 13:30:01 crc kubenswrapper[4851]: I1001 13:30:01.027767 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-6hcgd"] Oct 01 13:30:01 crc kubenswrapper[4851]: I1001 13:30:01.881046 4851 generic.go:334] "Generic (PLEG): container finished" podID="ffd845e9-6f20-4b95-9d0d-b71c66d0ea05" containerID="1a1570d4f6b5d5f0c56352a31d7a4b8f766c47c9db00f615b20952e6d4d59e87" exitCode=0 Oct 01 13:30:01 crc kubenswrapper[4851]: I1001 13:30:01.881120 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-6hcgd" event={"ID":"ffd845e9-6f20-4b95-9d0d-b71c66d0ea05","Type":"ContainerDied","Data":"1a1570d4f6b5d5f0c56352a31d7a4b8f766c47c9db00f615b20952e6d4d59e87"} Oct 01 13:30:01 crc kubenswrapper[4851]: I1001 13:30:01.881406 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-6hcgd" event={"ID":"ffd845e9-6f20-4b95-9d0d-b71c66d0ea05","Type":"ContainerStarted","Data":"de696ce8f7da96c0d08b9f7cb749419d094b624b9ad82f6dc1874b0a0ef59a78"} Oct 01 13:30:03 crc kubenswrapper[4851]: I1001 13:30:03.310902 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-6hcgd" Oct 01 13:30:03 crc kubenswrapper[4851]: I1001 13:30:03.434820 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffd845e9-6f20-4b95-9d0d-b71c66d0ea05-secret-volume\") pod \"ffd845e9-6f20-4b95-9d0d-b71c66d0ea05\" (UID: \"ffd845e9-6f20-4b95-9d0d-b71c66d0ea05\") " Oct 01 13:30:03 crc kubenswrapper[4851]: I1001 13:30:03.434925 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc8xp\" (UniqueName: \"kubernetes.io/projected/ffd845e9-6f20-4b95-9d0d-b71c66d0ea05-kube-api-access-dc8xp\") pod \"ffd845e9-6f20-4b95-9d0d-b71c66d0ea05\" (UID: \"ffd845e9-6f20-4b95-9d0d-b71c66d0ea05\") " Oct 01 13:30:03 crc kubenswrapper[4851]: I1001 13:30:03.435216 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffd845e9-6f20-4b95-9d0d-b71c66d0ea05-config-volume\") pod \"ffd845e9-6f20-4b95-9d0d-b71c66d0ea05\" (UID: \"ffd845e9-6f20-4b95-9d0d-b71c66d0ea05\") " Oct 01 13:30:03 crc kubenswrapper[4851]: I1001 13:30:03.435826 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffd845e9-6f20-4b95-9d0d-b71c66d0ea05-config-volume" (OuterVolumeSpecName: "config-volume") pod "ffd845e9-6f20-4b95-9d0d-b71c66d0ea05" (UID: "ffd845e9-6f20-4b95-9d0d-b71c66d0ea05"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:30:03 crc kubenswrapper[4851]: I1001 13:30:03.436178 4851 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffd845e9-6f20-4b95-9d0d-b71c66d0ea05-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:03 crc kubenswrapper[4851]: I1001 13:30:03.447692 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffd845e9-6f20-4b95-9d0d-b71c66d0ea05-kube-api-access-dc8xp" (OuterVolumeSpecName: "kube-api-access-dc8xp") pod "ffd845e9-6f20-4b95-9d0d-b71c66d0ea05" (UID: "ffd845e9-6f20-4b95-9d0d-b71c66d0ea05"). InnerVolumeSpecName "kube-api-access-dc8xp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:30:03 crc kubenswrapper[4851]: I1001 13:30:03.448842 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd845e9-6f20-4b95-9d0d-b71c66d0ea05-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ffd845e9-6f20-4b95-9d0d-b71c66d0ea05" (UID: "ffd845e9-6f20-4b95-9d0d-b71c66d0ea05"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:30:03 crc kubenswrapper[4851]: I1001 13:30:03.538363 4851 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffd845e9-6f20-4b95-9d0d-b71c66d0ea05-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:03 crc kubenswrapper[4851]: I1001 13:30:03.538414 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc8xp\" (UniqueName: \"kubernetes.io/projected/ffd845e9-6f20-4b95-9d0d-b71c66d0ea05-kube-api-access-dc8xp\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:03 crc kubenswrapper[4851]: I1001 13:30:03.902744 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-6hcgd" event={"ID":"ffd845e9-6f20-4b95-9d0d-b71c66d0ea05","Type":"ContainerDied","Data":"de696ce8f7da96c0d08b9f7cb749419d094b624b9ad82f6dc1874b0a0ef59a78"} Oct 01 13:30:03 crc kubenswrapper[4851]: I1001 13:30:03.902793 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de696ce8f7da96c0d08b9f7cb749419d094b624b9ad82f6dc1874b0a0ef59a78" Oct 01 13:30:03 crc kubenswrapper[4851]: I1001 13:30:03.902834 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-6hcgd" Oct 01 13:30:04 crc kubenswrapper[4851]: I1001 13:30:04.412643 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w"] Oct 01 13:30:04 crc kubenswrapper[4851]: I1001 13:30:04.426539 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322045-6cg7w"] Oct 01 13:30:06 crc kubenswrapper[4851]: I1001 13:30:06.346601 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23bc52e7-0e50-454f-87ee-b82b608ee34a" path="/var/lib/kubelet/pods/23bc52e7-0e50-454f-87ee-b82b608ee34a/volumes" Oct 01 13:30:10 crc kubenswrapper[4851]: I1001 13:30:10.996221 4851 generic.go:334] "Generic (PLEG): container finished" podID="dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0" containerID="fde90d86d534916f924554f31d968fd7f5886eea5c102fa49611dd552b7dda92" exitCode=0 Oct 01 13:30:10 crc kubenswrapper[4851]: I1001 13:30:10.996336 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" event={"ID":"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0","Type":"ContainerDied","Data":"fde90d86d534916f924554f31d968fd7f5886eea5c102fa49611dd552b7dda92"} Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.470001 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.545930 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.546330 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-nova-combined-ca-bundle\") pod \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.546377 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-neutron-metadata-combined-ca-bundle\") pod \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.546428 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-ovn-default-certs-0\") pod \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.546474 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-telemetry-combined-ca-bundle\") pod \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.546527 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-repo-setup-combined-ca-bundle\") pod \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.546582 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-inventory\") pod \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.546617 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-bootstrap-combined-ca-bundle\") pod \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.546676 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-libvirt-combined-ca-bundle\") pod \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " Oct 01 
13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.546721 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.546742 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-ssh-key\") pod \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.547011 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-ovn-combined-ca-bundle\") pod \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.547079 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmxbj\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-kube-api-access-lmxbj\") pod \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.547114 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\" (UID: \"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0\") " Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.553091 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0" (UID: "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.553126 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0" (UID: "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.554144 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0" (UID: "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.554656 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0" (UID: "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.555554 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0" (UID: "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.555822 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0" (UID: "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.557430 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-kube-api-access-lmxbj" (OuterVolumeSpecName: "kube-api-access-lmxbj") pod "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0" (UID: "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0"). InnerVolumeSpecName "kube-api-access-lmxbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.557462 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0" (UID: "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.559001 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0" (UID: "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.559467 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0" (UID: "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.559580 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0" (UID: "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.564673 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0" (UID: "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.593161 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0" (UID: "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.607446 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-inventory" (OuterVolumeSpecName: "inventory") pod "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0" (UID: "dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.649649 4851 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.649865 4851 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.649927 4851 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.649980 4851 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.650049 4851 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.650104 4851 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:12 crc 
kubenswrapper[4851]: I1001 13:30:12.650157 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmxbj\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-kube-api-access-lmxbj\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.650212 4851 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.650265 4851 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.650320 4851 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.650376 4851 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.650430 4851 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.650481 4851 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:12 crc kubenswrapper[4851]: I1001 13:30:12.650560 4851 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.022143 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" event={"ID":"dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0","Type":"ContainerDied","Data":"468071ed5d14537726aa2b0b41276e8782a6e2ce33ba85937ab07b12c0429bd5"} Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.022220 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="468071ed5d14537726aa2b0b41276e8782a6e2ce33ba85937ab07b12c0429bd5" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.022223 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.246382 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc"] Oct 01 13:30:13 crc kubenswrapper[4851]: E1001 13:30:13.246973 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd845e9-6f20-4b95-9d0d-b71c66d0ea05" containerName="collect-profiles" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.246990 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd845e9-6f20-4b95-9d0d-b71c66d0ea05" containerName="collect-profiles" Oct 01 13:30:13 crc kubenswrapper[4851]: E1001 13:30:13.247039 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.247049 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.247298 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffd845e9-6f20-4b95-9d0d-b71c66d0ea05" containerName="collect-profiles" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.247337 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.248218 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.251015 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.251032 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.251300 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.251325 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2tz4d" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.251714 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.263844 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc"] Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.370176 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d579751c-546b-4a48-8ae6-f9753609107a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkddc\" (UID: \"d579751c-546b-4a48-8ae6-f9753609107a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.370235 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lzgw\" (UniqueName: 
\"kubernetes.io/projected/d579751c-546b-4a48-8ae6-f9753609107a-kube-api-access-7lzgw\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkddc\" (UID: \"d579751c-546b-4a48-8ae6-f9753609107a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.370452 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d579751c-546b-4a48-8ae6-f9753609107a-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkddc\" (UID: \"d579751c-546b-4a48-8ae6-f9753609107a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.370657 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d579751c-546b-4a48-8ae6-f9753609107a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkddc\" (UID: \"d579751c-546b-4a48-8ae6-f9753609107a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.370905 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d579751c-546b-4a48-8ae6-f9753609107a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkddc\" (UID: \"d579751c-546b-4a48-8ae6-f9753609107a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.473013 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d579751c-546b-4a48-8ae6-f9753609107a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkddc\" (UID: \"d579751c-546b-4a48-8ae6-f9753609107a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.473156 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lzgw\" (UniqueName: \"kubernetes.io/projected/d579751c-546b-4a48-8ae6-f9753609107a-kube-api-access-7lzgw\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkddc\" (UID: \"d579751c-546b-4a48-8ae6-f9753609107a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.473256 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d579751c-546b-4a48-8ae6-f9753609107a-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkddc\" (UID: \"d579751c-546b-4a48-8ae6-f9753609107a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.473474 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d579751c-546b-4a48-8ae6-f9753609107a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkddc\" (UID: \"d579751c-546b-4a48-8ae6-f9753609107a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.473650 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d579751c-546b-4a48-8ae6-f9753609107a-ovn-combined-ca-bundle\") 
pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkddc\" (UID: \"d579751c-546b-4a48-8ae6-f9753609107a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.474303 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d579751c-546b-4a48-8ae6-f9753609107a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkddc\" (UID: \"d579751c-546b-4a48-8ae6-f9753609107a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.477763 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d579751c-546b-4a48-8ae6-f9753609107a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkddc\" (UID: \"d579751c-546b-4a48-8ae6-f9753609107a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.477873 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d579751c-546b-4a48-8ae6-f9753609107a-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkddc\" (UID: \"d579751c-546b-4a48-8ae6-f9753609107a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.486462 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d579751c-546b-4a48-8ae6-f9753609107a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkddc\" (UID: \"d579751c-546b-4a48-8ae6-f9753609107a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.496597 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lzgw\" (UniqueName: \"kubernetes.io/projected/d579751c-546b-4a48-8ae6-f9753609107a-kube-api-access-7lzgw\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkddc\" (UID: \"d579751c-546b-4a48-8ae6-f9753609107a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" Oct 01 13:30:13 crc kubenswrapper[4851]: I1001 13:30:13.576682 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" Oct 01 13:30:14 crc kubenswrapper[4851]: I1001 13:30:14.146051 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc"] Oct 01 13:30:14 crc kubenswrapper[4851]: W1001 13:30:14.154402 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd579751c_546b_4a48_8ae6_f9753609107a.slice/crio-690a3c5dfc99769fe82435d0c77b5cc105d6dec251b46a36139a3d4af468b57f WatchSource:0}: Error finding container 690a3c5dfc99769fe82435d0c77b5cc105d6dec251b46a36139a3d4af468b57f: Status 404 returned error can't find the container with id 690a3c5dfc99769fe82435d0c77b5cc105d6dec251b46a36139a3d4af468b57f Oct 01 13:30:15 crc kubenswrapper[4851]: I1001 13:30:15.042581 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" event={"ID":"d579751c-546b-4a48-8ae6-f9753609107a","Type":"ContainerStarted","Data":"690a3c5dfc99769fe82435d0c77b5cc105d6dec251b46a36139a3d4af468b57f"} Oct 01 13:30:16 crc kubenswrapper[4851]: I1001 13:30:16.052459 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" event={"ID":"d579751c-546b-4a48-8ae6-f9753609107a","Type":"ContainerStarted","Data":"b65b9ab76158966cd3932ba04176d7622ca5e1ab56571e0cef73612bc71cec13"} Oct 01 13:30:16 crc kubenswrapper[4851]: I1001 13:30:16.069377 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" podStartSLOduration=2.3257615879999998 podStartE2EDuration="3.069356695s" podCreationTimestamp="2025-10-01 13:30:13 +0000 UTC" firstStartedPulling="2025-10-01 13:30:14.157141363 +0000 UTC m=+2222.502258849" lastFinishedPulling="2025-10-01 13:30:14.90073643 +0000 UTC m=+2223.245853956" observedRunningTime="2025-10-01 13:30:16.06638138 +0000 UTC m=+2224.411498886" watchObservedRunningTime="2025-10-01 13:30:16.069356695 +0000 UTC m=+2224.414474181" Oct 01 13:30:18 crc kubenswrapper[4851]: I1001 13:30:18.009922 4851 scope.go:117] "RemoveContainer" containerID="385bb429789121b154f05012ccdccfb383bb2c47dcac308f555f2dad6f87c844" Oct 01 13:30:30 crc kubenswrapper[4851]: I1001 13:30:30.049977 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:30:30 crc kubenswrapper[4851]: I1001 13:30:30.050683 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:31:00 crc kubenswrapper[4851]: I1001 13:31:00.050289 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:31:00 crc kubenswrapper[4851]: I1001 13:31:00.051276 4851 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:31:00 crc kubenswrapper[4851]: I1001 13:31:00.051442 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 13:31:00 crc kubenswrapper[4851]: I1001 13:31:00.053191 4851 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb"} pod="openshift-machine-config-operator/machine-config-daemon-fv72m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:31:00 crc kubenswrapper[4851]: I1001 13:31:00.053303 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" containerID="cri-o://4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" gracePeriod=600 Oct 01 13:31:00 crc kubenswrapper[4851]: I1001 13:31:00.545460 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerDied","Data":"4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb"} Oct 01 13:31:00 crc kubenswrapper[4851]: I1001 13:31:00.545534 4851 scope.go:117] "RemoveContainer" containerID="2cf857fcafcc27114560c9d5ed62e037c4c2e05f3ac9187b9ce4b4a9bc35966e" Oct 01 13:31:00 crc kubenswrapper[4851]: I1001 13:31:00.545465 4851 generic.go:334] "Generic (PLEG): container finished" podID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" exitCode=0 Oct 01 13:31:00 crc kubenswrapper[4851]: E1001 13:31:00.818356 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:31:01 crc kubenswrapper[4851]: I1001 13:31:01.564814 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:31:01 crc kubenswrapper[4851]: E1001 13:31:01.565443 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:31:16 crc kubenswrapper[4851]: I1001 13:31:16.328988 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:31:16 crc kubenswrapper[4851]: E1001 13:31:16.329802 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:31:27 crc kubenswrapper[4851]: I1001 13:31:27.330764 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:31:27 crc kubenswrapper[4851]: E1001 13:31:27.332170 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:31:29 crc kubenswrapper[4851]: I1001 13:31:29.894902 4851 generic.go:334] "Generic (PLEG): container finished" podID="d579751c-546b-4a48-8ae6-f9753609107a" containerID="b65b9ab76158966cd3932ba04176d7622ca5e1ab56571e0cef73612bc71cec13" exitCode=0 Oct 01 13:31:29 crc kubenswrapper[4851]: I1001 13:31:29.895034 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" event={"ID":"d579751c-546b-4a48-8ae6-f9753609107a","Type":"ContainerDied","Data":"b65b9ab76158966cd3932ba04176d7622ca5e1ab56571e0cef73612bc71cec13"} Oct 01 13:31:31 crc kubenswrapper[4851]: I1001 13:31:31.386974 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" Oct 01 13:31:31 crc kubenswrapper[4851]: I1001 13:31:31.525616 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d579751c-546b-4a48-8ae6-f9753609107a-inventory\") pod \"d579751c-546b-4a48-8ae6-f9753609107a\" (UID: \"d579751c-546b-4a48-8ae6-f9753609107a\") " Oct 01 13:31:31 crc kubenswrapper[4851]: I1001 13:31:31.525684 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d579751c-546b-4a48-8ae6-f9753609107a-ovn-combined-ca-bundle\") pod \"d579751c-546b-4a48-8ae6-f9753609107a\" (UID: \"d579751c-546b-4a48-8ae6-f9753609107a\") " Oct 01 13:31:31 crc kubenswrapper[4851]: I1001 13:31:31.525889 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lzgw\" (UniqueName: \"kubernetes.io/projected/d579751c-546b-4a48-8ae6-f9753609107a-kube-api-access-7lzgw\") pod \"d579751c-546b-4a48-8ae6-f9753609107a\" (UID: \"d579751c-546b-4a48-8ae6-f9753609107a\") " Oct 01 13:31:31 crc kubenswrapper[4851]: I1001 13:31:31.525949 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d579751c-546b-4a48-8ae6-f9753609107a-ovncontroller-config-0\") pod \"d579751c-546b-4a48-8ae6-f9753609107a\" (UID: \"d579751c-546b-4a48-8ae6-f9753609107a\") " Oct 01 13:31:31 crc kubenswrapper[4851]: I1001 13:31:31.526147 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d579751c-546b-4a48-8ae6-f9753609107a-ssh-key\") pod 
\"d579751c-546b-4a48-8ae6-f9753609107a\" (UID: \"d579751c-546b-4a48-8ae6-f9753609107a\") " Oct 01 13:31:31 crc kubenswrapper[4851]: I1001 13:31:31.531792 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d579751c-546b-4a48-8ae6-f9753609107a-kube-api-access-7lzgw" (OuterVolumeSpecName: "kube-api-access-7lzgw") pod "d579751c-546b-4a48-8ae6-f9753609107a" (UID: "d579751c-546b-4a48-8ae6-f9753609107a"). InnerVolumeSpecName "kube-api-access-7lzgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:31:31 crc kubenswrapper[4851]: I1001 13:31:31.532678 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d579751c-546b-4a48-8ae6-f9753609107a-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d579751c-546b-4a48-8ae6-f9753609107a" (UID: "d579751c-546b-4a48-8ae6-f9753609107a"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:31:31 crc kubenswrapper[4851]: I1001 13:31:31.565637 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d579751c-546b-4a48-8ae6-f9753609107a-inventory" (OuterVolumeSpecName: "inventory") pod "d579751c-546b-4a48-8ae6-f9753609107a" (UID: "d579751c-546b-4a48-8ae6-f9753609107a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:31:31 crc kubenswrapper[4851]: I1001 13:31:31.567863 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d579751c-546b-4a48-8ae6-f9753609107a-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "d579751c-546b-4a48-8ae6-f9753609107a" (UID: "d579751c-546b-4a48-8ae6-f9753609107a"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:31:31 crc kubenswrapper[4851]: I1001 13:31:31.571935 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d579751c-546b-4a48-8ae6-f9753609107a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d579751c-546b-4a48-8ae6-f9753609107a" (UID: "d579751c-546b-4a48-8ae6-f9753609107a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:31:31 crc kubenswrapper[4851]: I1001 13:31:31.631779 4851 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d579751c-546b-4a48-8ae6-f9753609107a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:31:31 crc kubenswrapper[4851]: I1001 13:31:31.631813 4851 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d579751c-546b-4a48-8ae6-f9753609107a-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:31:31 crc kubenswrapper[4851]: I1001 13:31:31.631823 4851 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d579751c-546b-4a48-8ae6-f9753609107a-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:31:31 crc kubenswrapper[4851]: I1001 13:31:31.631833 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lzgw\" (UniqueName: \"kubernetes.io/projected/d579751c-546b-4a48-8ae6-f9753609107a-kube-api-access-7lzgw\") on node \"crc\" DevicePath \"\"" Oct 01 13:31:31 crc kubenswrapper[4851]: I1001 13:31:31.631842 4851 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d579751c-546b-4a48-8ae6-f9753609107a-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:31:31 crc kubenswrapper[4851]: I1001 13:31:31.917141 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" event={"ID":"d579751c-546b-4a48-8ae6-f9753609107a","Type":"ContainerDied","Data":"690a3c5dfc99769fe82435d0c77b5cc105d6dec251b46a36139a3d4af468b57f"} Oct 01 13:31:31 crc kubenswrapper[4851]: I1001 13:31:31.917194 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="690a3c5dfc99769fe82435d0c77b5cc105d6dec251b46a36139a3d4af468b57f" Oct 01 13:31:31 crc kubenswrapper[4851]: I1001 13:31:31.917205 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkddc" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.030669 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v"] Oct 01 13:31:32 crc kubenswrapper[4851]: E1001 13:31:32.031170 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d579751c-546b-4a48-8ae6-f9753609107a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.031199 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="d579751c-546b-4a48-8ae6-f9753609107a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.031457 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="d579751c-546b-4a48-8ae6-f9753609107a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.032348 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.036917 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2tz4d" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.037174 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.037237 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.037199 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.037413 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.039677 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.039720 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.039797 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzd24\" (UniqueName: \"kubernetes.io/projected/f1395fd2-649e-42f8-b320-5d81ae321978-kube-api-access-gzd24\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.039833 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.039885 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.040072 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.042844 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.061483 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v"] Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.142066 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.142276 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.142395 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.142448 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.142554 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzd24\" (UniqueName: \"kubernetes.io/projected/f1395fd2-649e-42f8-b320-5d81ae321978-kube-api-access-gzd24\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.142607 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" Oct 01 13:31:32 
crc kubenswrapper[4851]: I1001 13:31:32.145934 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.147639 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.147720 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.147826 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.155146 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.161849 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzd24\" (UniqueName: \"kubernetes.io/projected/f1395fd2-649e-42f8-b320-5d81ae321978-kube-api-access-gzd24\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.357385 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.911298 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v"] Oct 01 13:31:32 crc kubenswrapper[4851]: I1001 13:31:32.926449 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" event={"ID":"f1395fd2-649e-42f8-b320-5d81ae321978","Type":"ContainerStarted","Data":"c40db9cb664d124d764066852fa9ad53603e06b9921804b5782e9f3bcdadf6e1"} Oct 01 13:31:34 crc kubenswrapper[4851]: I1001 13:31:34.950978 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" event={"ID":"f1395fd2-649e-42f8-b320-5d81ae321978","Type":"ContainerStarted","Data":"27256f4f66048f049cb731571aec2a64ce456bb88976017d28bb32d1d5fe81a4"} Oct 01 13:31:34 crc kubenswrapper[4851]: I1001 13:31:34.984985 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" podStartSLOduration=1.655192336 podStartE2EDuration="2.984962759s" podCreationTimestamp="2025-10-01 13:31:32 +0000 UTC" firstStartedPulling="2025-10-01 13:31:32.91479723 +0000 UTC m=+2301.259914716" lastFinishedPulling="2025-10-01 13:31:34.244567613 +0000 UTC m=+2302.589685139" observedRunningTime="2025-10-01 13:31:34.974326275 +0000 UTC m=+2303.319443801" watchObservedRunningTime="2025-10-01 13:31:34.984962759 +0000 UTC m=+2303.330080255" Oct 01 13:31:38 crc kubenswrapper[4851]: I1001 13:31:38.328649 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:31:38 crc kubenswrapper[4851]: E1001 13:31:38.329322 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:31:50 crc kubenswrapper[4851]: I1001 13:31:50.328127 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:31:50 crc kubenswrapper[4851]: E1001 13:31:50.328830 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:32:01 crc kubenswrapper[4851]: I1001 13:32:01.330312 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:32:01 crc kubenswrapper[4851]: E1001 13:32:01.331321 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:32:12 crc kubenswrapper[4851]: I1001 13:32:12.347547 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:32:12 crc kubenswrapper[4851]: E1001 13:32:12.348351 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:32:24 crc kubenswrapper[4851]: I1001 13:32:24.328707 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:32:24 crc kubenswrapper[4851]: E1001 13:32:24.329684 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:32:28 crc kubenswrapper[4851]: I1001 13:32:28.597781 4851 generic.go:334] "Generic (PLEG): container finished" podID="f1395fd2-649e-42f8-b320-5d81ae321978" containerID="27256f4f66048f049cb731571aec2a64ce456bb88976017d28bb32d1d5fe81a4" exitCode=0 Oct 01 13:32:28 crc kubenswrapper[4851]: I1001 13:32:28.597906 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" event={"ID":"f1395fd2-649e-42f8-b320-5d81ae321978","Type":"ContainerDied","Data":"27256f4f66048f049cb731571aec2a64ce456bb88976017d28bb32d1d5fe81a4"} Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.061288 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.213954 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzd24\" (UniqueName: \"kubernetes.io/projected/f1395fd2-649e-42f8-b320-5d81ae321978-kube-api-access-gzd24\") pod \"f1395fd2-649e-42f8-b320-5d81ae321978\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.214092 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-ssh-key\") pod \"f1395fd2-649e-42f8-b320-5d81ae321978\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.214150 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-inventory\") pod \"f1395fd2-649e-42f8-b320-5d81ae321978\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.214233 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-nova-metadata-neutron-config-0\") pod \"f1395fd2-649e-42f8-b320-5d81ae321978\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.214336 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-neutron-ovn-metadata-agent-neutron-config-0\") pod \"f1395fd2-649e-42f8-b320-5d81ae321978\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.214379 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-neutron-metadata-combined-ca-bundle\") pod \"f1395fd2-649e-42f8-b320-5d81ae321978\" (UID: \"f1395fd2-649e-42f8-b320-5d81ae321978\") " Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.219319 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f1395fd2-649e-42f8-b320-5d81ae321978" (UID: "f1395fd2-649e-42f8-b320-5d81ae321978"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.229086 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1395fd2-649e-42f8-b320-5d81ae321978-kube-api-access-gzd24" (OuterVolumeSpecName: "kube-api-access-gzd24") pod "f1395fd2-649e-42f8-b320-5d81ae321978" (UID: "f1395fd2-649e-42f8-b320-5d81ae321978"). InnerVolumeSpecName "kube-api-access-gzd24". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.243017 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-inventory" (OuterVolumeSpecName: "inventory") pod "f1395fd2-649e-42f8-b320-5d81ae321978" (UID: "f1395fd2-649e-42f8-b320-5d81ae321978"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.244269 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "f1395fd2-649e-42f8-b320-5d81ae321978" (UID: "f1395fd2-649e-42f8-b320-5d81ae321978"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.244745 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f1395fd2-649e-42f8-b320-5d81ae321978" (UID: "f1395fd2-649e-42f8-b320-5d81ae321978"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.245193 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "f1395fd2-649e-42f8-b320-5d81ae321978" (UID: "f1395fd2-649e-42f8-b320-5d81ae321978"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.316487 4851 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.316535 4851 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.316546 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzd24\" (UniqueName: \"kubernetes.io/projected/f1395fd2-649e-42f8-b320-5d81ae321978-kube-api-access-gzd24\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.316555 4851 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.316566 4851 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.316574 4851 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f1395fd2-649e-42f8-b320-5d81ae321978-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.618660 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" event={"ID":"f1395fd2-649e-42f8-b320-5d81ae321978","Type":"ContainerDied","Data":"c40db9cb664d124d764066852fa9ad53603e06b9921804b5782e9f3bcdadf6e1"} Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.618705 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c40db9cb664d124d764066852fa9ad53603e06b9921804b5782e9f3bcdadf6e1" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.618707 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.734080 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l"] Oct 01 13:32:30 crc kubenswrapper[4851]: E1001 13:32:30.734541 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1395fd2-649e-42f8-b320-5d81ae321978" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.734562 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1395fd2-649e-42f8-b320-5d81ae321978" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.734786 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1395fd2-649e-42f8-b320-5d81ae321978" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.735473 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.738268 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.738399 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.738436 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.739870 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2tz4d" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.740776 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.744379 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l"] Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.827860 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9s94l\" (UID: \"18d67c2a-547a-448f-8492-c4c997cc938e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.828018 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrvq2\" (UniqueName: \"kubernetes.io/projected/18d67c2a-547a-448f-8492-c4c997cc938e-kube-api-access-zrvq2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9s94l\" (UID: \"18d67c2a-547a-448f-8492-c4c997cc938e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.828051 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9s94l\" (UID: \"18d67c2a-547a-448f-8492-c4c997cc938e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.828367 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9s94l\" (UID: \"18d67c2a-547a-448f-8492-c4c997cc938e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.828812 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9s94l\" (UID: \"18d67c2a-547a-448f-8492-c4c997cc938e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.930387 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zrvq2\" (UniqueName: \"kubernetes.io/projected/18d67c2a-547a-448f-8492-c4c997cc938e-kube-api-access-zrvq2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9s94l\" (UID: \"18d67c2a-547a-448f-8492-c4c997cc938e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.930752 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9s94l\" (UID: \"18d67c2a-547a-448f-8492-c4c997cc938e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.930848 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9s94l\" (UID: \"18d67c2a-547a-448f-8492-c4c997cc938e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.930918 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9s94l\" (UID: \"18d67c2a-547a-448f-8492-c4c997cc938e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.930950 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9s94l\" (UID: \"18d67c2a-547a-448f-8492-c4c997cc938e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.937171 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9s94l\" (UID: \"18d67c2a-547a-448f-8492-c4c997cc938e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.937472 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9s94l\" (UID: \"18d67c2a-547a-448f-8492-c4c997cc938e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.937617 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9s94l\" (UID: \"18d67c2a-547a-448f-8492-c4c997cc938e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.939999 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-ssh-key\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-9s94l\" (UID: \"18d67c2a-547a-448f-8492-c4c997cc938e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" Oct 01 13:32:30 crc kubenswrapper[4851]: I1001 13:32:30.960567 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrvq2\" (UniqueName: \"kubernetes.io/projected/18d67c2a-547a-448f-8492-c4c997cc938e-kube-api-access-zrvq2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9s94l\" (UID: \"18d67c2a-547a-448f-8492-c4c997cc938e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" Oct 01 13:32:31 crc kubenswrapper[4851]: I1001 13:32:31.095951 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" Oct 01 13:32:31 crc kubenswrapper[4851]: I1001 13:32:31.628169 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l"] Oct 01 13:32:32 crc kubenswrapper[4851]: I1001 13:32:32.645577 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" event={"ID":"18d67c2a-547a-448f-8492-c4c997cc938e","Type":"ContainerStarted","Data":"d89c3f953e1bb101096c929955a5976e8966eada668f39d0e5f6ebbc0c6bcdb9"} Oct 01 13:32:33 crc kubenswrapper[4851]: I1001 13:32:33.677435 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" event={"ID":"18d67c2a-547a-448f-8492-c4c997cc938e","Type":"ContainerStarted","Data":"2cc9bb7b6205443fe39650d9350876b0d2e155b0b9a9ec2ed37df3e3d744ccfa"} Oct 01 13:32:33 crc kubenswrapper[4851]: I1001 13:32:33.697125 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" podStartSLOduration=2.867359831 podStartE2EDuration="3.697106226s" podCreationTimestamp="2025-10-01 13:32:30 +0000 UTC" firstStartedPulling="2025-10-01 13:32:31.632357903 +0000 UTC m=+2359.977475389" lastFinishedPulling="2025-10-01 13:32:32.462104258 +0000 UTC m=+2360.807221784" observedRunningTime="2025-10-01 13:32:33.692890856 +0000 UTC m=+2362.038008342" watchObservedRunningTime="2025-10-01 13:32:33.697106226 +0000 UTC m=+2362.042223712" Oct 01 13:32:37 crc kubenswrapper[4851]: I1001 13:32:37.328756 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:32:37 crc kubenswrapper[4851]: E1001 13:32:37.329745 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:32:49 crc kubenswrapper[4851]: I1001 13:32:49.328601 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:32:49 crc kubenswrapper[4851]: E1001 13:32:49.329650 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:33:02 crc kubenswrapper[4851]: I1001 13:33:02.334329 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:33:02 crc kubenswrapper[4851]: E1001 13:33:02.335223 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:33:17 crc kubenswrapper[4851]: I1001 13:33:17.329303 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:33:17 crc kubenswrapper[4851]: E1001 13:33:17.331404 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:33:28 crc kubenswrapper[4851]: I1001 13:33:28.329398 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:33:28 crc kubenswrapper[4851]: E1001 13:33:28.330381 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:33:39 crc kubenswrapper[4851]: I1001 13:33:39.329047 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:33:39 crc kubenswrapper[4851]: E1001 13:33:39.330010 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:33:50 crc kubenswrapper[4851]: I1001 13:33:50.328466 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:33:50 crc kubenswrapper[4851]: E1001 13:33:50.329480 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:34:02 crc kubenswrapper[4851]: I1001 13:34:02.335783 4851 
scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:34:02 crc kubenswrapper[4851]: E1001 13:34:02.336576 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:34:17 crc kubenswrapper[4851]: I1001 13:34:17.328970 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:34:17 crc kubenswrapper[4851]: E1001 13:34:17.331174 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:34:30 crc kubenswrapper[4851]: I1001 13:34:30.329770 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:34:30 crc kubenswrapper[4851]: E1001 13:34:30.330742 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:34:44 crc kubenswrapper[4851]: I1001 13:34:44.329260 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:34:44 crc kubenswrapper[4851]: E1001 13:34:44.329992 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:34:45 crc kubenswrapper[4851]: I1001 13:34:45.897519 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-86bc8"] Oct 01 13:34:45 crc kubenswrapper[4851]: I1001 13:34:45.899445 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-86bc8" Oct 01 13:34:45 crc kubenswrapper[4851]: I1001 13:34:45.922172 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-86bc8"] Oct 01 13:34:46 crc kubenswrapper[4851]: I1001 13:34:46.035674 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-247g2\" (UniqueName: \"kubernetes.io/projected/5dbb888e-5850-4aca-bc65-386c6e112aac-kube-api-access-247g2\") pod \"community-operators-86bc8\" (UID: \"5dbb888e-5850-4aca-bc65-386c6e112aac\") " pod="openshift-marketplace/community-operators-86bc8" Oct 01 13:34:46 crc kubenswrapper[4851]: I1001 13:34:46.035735 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dbb888e-5850-4aca-bc65-386c6e112aac-utilities\") pod \"community-operators-86bc8\" (UID: \"5dbb888e-5850-4aca-bc65-386c6e112aac\") " pod="openshift-marketplace/community-operators-86bc8" Oct 01 13:34:46 crc kubenswrapper[4851]: I1001 13:34:46.036131 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dbb888e-5850-4aca-bc65-386c6e112aac-catalog-content\") pod \"community-operators-86bc8\" (UID: \"5dbb888e-5850-4aca-bc65-386c6e112aac\") " pod="openshift-marketplace/community-operators-86bc8" Oct 01 13:34:46 crc kubenswrapper[4851]: I1001 13:34:46.138234 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-247g2\" (UniqueName: \"kubernetes.io/projected/5dbb888e-5850-4aca-bc65-386c6e112aac-kube-api-access-247g2\") pod \"community-operators-86bc8\" (UID: \"5dbb888e-5850-4aca-bc65-386c6e112aac\") " pod="openshift-marketplace/community-operators-86bc8" Oct 01 13:34:46 crc kubenswrapper[4851]: I1001 13:34:46.138309 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dbb888e-5850-4aca-bc65-386c6e112aac-utilities\") pod \"community-operators-86bc8\" (UID: \"5dbb888e-5850-4aca-bc65-386c6e112aac\") " pod="openshift-marketplace/community-operators-86bc8" Oct 01 13:34:46 crc kubenswrapper[4851]: I1001 13:34:46.138513 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dbb888e-5850-4aca-bc65-386c6e112aac-catalog-content\") pod \"community-operators-86bc8\" (UID: \"5dbb888e-5850-4aca-bc65-386c6e112aac\") " pod="openshift-marketplace/community-operators-86bc8" Oct 01 13:34:46 crc kubenswrapper[4851]: I1001 13:34:46.139103 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dbb888e-5850-4aca-bc65-386c6e112aac-catalog-content\") pod \"community-operators-86bc8\" (UID: \"5dbb888e-5850-4aca-bc65-386c6e112aac\") " pod="openshift-marketplace/community-operators-86bc8" Oct 01 13:34:46 crc kubenswrapper[4851]: I1001 13:34:46.139246 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dbb888e-5850-4aca-bc65-386c6e112aac-utilities\") pod \"community-operators-86bc8\" (UID: \"5dbb888e-5850-4aca-bc65-386c6e112aac\") " pod="openshift-marketplace/community-operators-86bc8" Oct 01 13:34:46 crc kubenswrapper[4851]: I1001 13:34:46.166568 4851 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-247g2\" (UniqueName: \"kubernetes.io/projected/5dbb888e-5850-4aca-bc65-386c6e112aac-kube-api-access-247g2\") pod \"community-operators-86bc8\" (UID: \"5dbb888e-5850-4aca-bc65-386c6e112aac\") " pod="openshift-marketplace/community-operators-86bc8" Oct 01 13:34:46 crc kubenswrapper[4851]: I1001 13:34:46.227854 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-86bc8" Oct 01 13:34:46 crc kubenswrapper[4851]: I1001 13:34:46.731769 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-86bc8"] Oct 01 13:34:46 crc kubenswrapper[4851]: I1001 13:34:46.984422 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86bc8" event={"ID":"5dbb888e-5850-4aca-bc65-386c6e112aac","Type":"ContainerStarted","Data":"1ddd95c950b8bc084fc940ff2128b93b967a20e6574e48e5c191ea87892d231e"} Oct 01 13:34:46 crc kubenswrapper[4851]: I1001 13:34:46.984838 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86bc8" event={"ID":"5dbb888e-5850-4aca-bc65-386c6e112aac","Type":"ContainerStarted","Data":"daacde5d4b0fca96f2b8ec58ba620557fafb281eed189cbc02a51b962b3c30c1"} Oct 01 13:34:47 crc kubenswrapper[4851]: I1001 13:34:47.996108 4851 generic.go:334] "Generic (PLEG): container finished" podID="5dbb888e-5850-4aca-bc65-386c6e112aac" containerID="1ddd95c950b8bc084fc940ff2128b93b967a20e6574e48e5c191ea87892d231e" exitCode=0 Oct 01 13:34:47 crc kubenswrapper[4851]: I1001 13:34:47.996159 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86bc8" event={"ID":"5dbb888e-5850-4aca-bc65-386c6e112aac","Type":"ContainerDied","Data":"1ddd95c950b8bc084fc940ff2128b93b967a20e6574e48e5c191ea87892d231e"} Oct 01 13:34:48 crc kubenswrapper[4851]: I1001 13:34:47.999642 4851 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:34:49 crc kubenswrapper[4851]: I1001 13:34:49.008150 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86bc8" event={"ID":"5dbb888e-5850-4aca-bc65-386c6e112aac","Type":"ContainerStarted","Data":"a734c1ca19a257c3e38cd85d1abe7c46ef8a47ff828703749d24d3355c72154f"} Oct 01 13:34:50 crc kubenswrapper[4851]: I1001 13:34:50.019657 4851 generic.go:334] "Generic (PLEG): container finished" podID="5dbb888e-5850-4aca-bc65-386c6e112aac" containerID="a734c1ca19a257c3e38cd85d1abe7c46ef8a47ff828703749d24d3355c72154f" exitCode=0 Oct 01 13:34:50 crc kubenswrapper[4851]: I1001 13:34:50.019722 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86bc8" event={"ID":"5dbb888e-5850-4aca-bc65-386c6e112aac","Type":"ContainerDied","Data":"a734c1ca19a257c3e38cd85d1abe7c46ef8a47ff828703749d24d3355c72154f"} Oct 01 13:34:51 crc kubenswrapper[4851]: I1001 13:34:51.032195 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86bc8" event={"ID":"5dbb888e-5850-4aca-bc65-386c6e112aac","Type":"ContainerStarted","Data":"f40ab020ee23f565cd919684c4aec00d488898d5a33696a64f20d969c80e54cb"} Oct 01 13:34:51 crc kubenswrapper[4851]: I1001 13:34:51.065300 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-86bc8" podStartSLOduration=3.571510955 podStartE2EDuration="6.065276053s" 
podCreationTimestamp="2025-10-01 13:34:45 +0000 UTC" firstStartedPulling="2025-10-01 13:34:47.999393499 +0000 UTC m=+2496.344510985" lastFinishedPulling="2025-10-01 13:34:50.493158567 +0000 UTC m=+2498.838276083" observedRunningTime="2025-10-01 13:34:51.056196484 +0000 UTC m=+2499.401314030" watchObservedRunningTime="2025-10-01 13:34:51.065276053 +0000 UTC m=+2499.410393539" Oct 01 13:34:56 crc kubenswrapper[4851]: I1001 13:34:56.228746 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-86bc8" Oct 01 13:34:56 crc kubenswrapper[4851]: I1001 13:34:56.229285 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-86bc8" Oct 01 13:34:56 crc kubenswrapper[4851]: I1001 13:34:56.271668 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-86bc8" Oct 01 13:34:57 crc kubenswrapper[4851]: I1001 13:34:57.168113 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-86bc8" Oct 01 13:34:57 crc kubenswrapper[4851]: I1001 13:34:57.214048 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-86bc8"] Oct 01 13:34:59 crc kubenswrapper[4851]: I1001 13:34:59.121566 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-86bc8" podUID="5dbb888e-5850-4aca-bc65-386c6e112aac" containerName="registry-server" containerID="cri-o://f40ab020ee23f565cd919684c4aec00d488898d5a33696a64f20d969c80e54cb" gracePeriod=2 Oct 01 13:34:59 crc kubenswrapper[4851]: I1001 13:34:59.329571 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:34:59 crc kubenswrapper[4851]: E1001 13:34:59.330558 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:35:00 crc kubenswrapper[4851]: I1001 13:35:00.135551 4851 generic.go:334] "Generic (PLEG): container finished" podID="5dbb888e-5850-4aca-bc65-386c6e112aac" containerID="f40ab020ee23f565cd919684c4aec00d488898d5a33696a64f20d969c80e54cb" exitCode=0 Oct 01 13:35:00 crc kubenswrapper[4851]: I1001 13:35:00.135701 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86bc8" event={"ID":"5dbb888e-5850-4aca-bc65-386c6e112aac","Type":"ContainerDied","Data":"f40ab020ee23f565cd919684c4aec00d488898d5a33696a64f20d969c80e54cb"} Oct 01 13:35:00 crc kubenswrapper[4851]: I1001 13:35:00.135977 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86bc8" event={"ID":"5dbb888e-5850-4aca-bc65-386c6e112aac","Type":"ContainerDied","Data":"daacde5d4b0fca96f2b8ec58ba620557fafb281eed189cbc02a51b962b3c30c1"} Oct 01 13:35:00 crc kubenswrapper[4851]: I1001 13:35:00.135998 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="daacde5d4b0fca96f2b8ec58ba620557fafb281eed189cbc02a51b962b3c30c1" Oct 01 13:35:00 crc kubenswrapper[4851]: I1001 13:35:00.239850 4851 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-86bc8" Oct 01 13:35:00 crc kubenswrapper[4851]: I1001 13:35:00.419369 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-247g2\" (UniqueName: \"kubernetes.io/projected/5dbb888e-5850-4aca-bc65-386c6e112aac-kube-api-access-247g2\") pod \"5dbb888e-5850-4aca-bc65-386c6e112aac\" (UID: \"5dbb888e-5850-4aca-bc65-386c6e112aac\") " Oct 01 13:35:00 crc kubenswrapper[4851]: I1001 13:35:00.419445 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dbb888e-5850-4aca-bc65-386c6e112aac-utilities\") pod \"5dbb888e-5850-4aca-bc65-386c6e112aac\" (UID: \"5dbb888e-5850-4aca-bc65-386c6e112aac\") " Oct 01 13:35:00 crc kubenswrapper[4851]: I1001 13:35:00.419755 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dbb888e-5850-4aca-bc65-386c6e112aac-catalog-content\") pod \"5dbb888e-5850-4aca-bc65-386c6e112aac\" (UID: \"5dbb888e-5850-4aca-bc65-386c6e112aac\") " Oct 01 13:35:00 crc kubenswrapper[4851]: I1001 13:35:00.425054 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dbb888e-5850-4aca-bc65-386c6e112aac-utilities" (OuterVolumeSpecName: "utilities") pod "5dbb888e-5850-4aca-bc65-386c6e112aac" (UID: "5dbb888e-5850-4aca-bc65-386c6e112aac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:35:00 crc kubenswrapper[4851]: I1001 13:35:00.430159 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dbb888e-5850-4aca-bc65-386c6e112aac-kube-api-access-247g2" (OuterVolumeSpecName: "kube-api-access-247g2") pod "5dbb888e-5850-4aca-bc65-386c6e112aac" (UID: "5dbb888e-5850-4aca-bc65-386c6e112aac"). InnerVolumeSpecName "kube-api-access-247g2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:35:00 crc kubenswrapper[4851]: I1001 13:35:00.525716 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-247g2\" (UniqueName: \"kubernetes.io/projected/5dbb888e-5850-4aca-bc65-386c6e112aac-kube-api-access-247g2\") on node \"crc\" DevicePath \"\"" Oct 01 13:35:00 crc kubenswrapper[4851]: I1001 13:35:00.525754 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dbb888e-5850-4aca-bc65-386c6e112aac-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:35:00 crc kubenswrapper[4851]: I1001 13:35:00.657078 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dbb888e-5850-4aca-bc65-386c6e112aac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5dbb888e-5850-4aca-bc65-386c6e112aac" (UID: "5dbb888e-5850-4aca-bc65-386c6e112aac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:35:00 crc kubenswrapper[4851]: I1001 13:35:00.728213 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dbb888e-5850-4aca-bc65-386c6e112aac-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:35:01 crc kubenswrapper[4851]: I1001 13:35:01.143718 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-86bc8" Oct 01 13:35:01 crc kubenswrapper[4851]: I1001 13:35:01.180143 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-86bc8"] Oct 01 13:35:01 crc kubenswrapper[4851]: I1001 13:35:01.190374 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-86bc8"] Oct 01 13:35:02 crc kubenswrapper[4851]: I1001 13:35:02.340080 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dbb888e-5850-4aca-bc65-386c6e112aac" path="/var/lib/kubelet/pods/5dbb888e-5850-4aca-bc65-386c6e112aac/volumes" Oct 01 13:35:11 crc kubenswrapper[4851]: I1001 13:35:11.329368 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:35:11 crc kubenswrapper[4851]: E1001 13:35:11.331351 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:35:24 crc kubenswrapper[4851]: I1001 13:35:24.330424 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:35:24 crc kubenswrapper[4851]: E1001 13:35:24.331199 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:35:37 crc kubenswrapper[4851]: I1001 13:35:37.328742 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:35:37 crc kubenswrapper[4851]: E1001 13:35:37.329573 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:35:49 crc kubenswrapper[4851]: I1001 13:35:49.329005 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:35:49 crc kubenswrapper[4851]: E1001 13:35:49.329982 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:36:03 crc kubenswrapper[4851]: I1001 13:36:03.329732 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 
13:36:03 crc kubenswrapper[4851]: I1001 13:36:03.801533 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"8c40691f1f8a44c2600070635532cee52f03666d706039454d1cc4f73511fa56"} Oct 01 13:37:34 crc kubenswrapper[4851]: I1001 13:37:34.707134 4851 generic.go:334] "Generic (PLEG): container finished" podID="18d67c2a-547a-448f-8492-c4c997cc938e" containerID="2cc9bb7b6205443fe39650d9350876b0d2e155b0b9a9ec2ed37df3e3d744ccfa" exitCode=0 Oct 01 13:37:34 crc kubenswrapper[4851]: I1001 13:37:34.707183 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" event={"ID":"18d67c2a-547a-448f-8492-c4c997cc938e","Type":"ContainerDied","Data":"2cc9bb7b6205443fe39650d9350876b0d2e155b0b9a9ec2ed37df3e3d744ccfa"} Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.136743 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.214307 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrvq2\" (UniqueName: \"kubernetes.io/projected/18d67c2a-547a-448f-8492-c4c997cc938e-kube-api-access-zrvq2\") pod \"18d67c2a-547a-448f-8492-c4c997cc938e\" (UID: \"18d67c2a-547a-448f-8492-c4c997cc938e\") " Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.214371 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-inventory\") pod \"18d67c2a-547a-448f-8492-c4c997cc938e\" (UID: \"18d67c2a-547a-448f-8492-c4c997cc938e\") " Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.214414 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-libvirt-secret-0\") pod \"18d67c2a-547a-448f-8492-c4c997cc938e\" (UID: \"18d67c2a-547a-448f-8492-c4c997cc938e\") " Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.214465 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-libvirt-combined-ca-bundle\") pod \"18d67c2a-547a-448f-8492-c4c997cc938e\" (UID: \"18d67c2a-547a-448f-8492-c4c997cc938e\") " Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.214525 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-ssh-key\") pod \"18d67c2a-547a-448f-8492-c4c997cc938e\" (UID: \"18d67c2a-547a-448f-8492-c4c997cc938e\") " Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.220233 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "18d67c2a-547a-448f-8492-c4c997cc938e" (UID: "18d67c2a-547a-448f-8492-c4c997cc938e"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.220946 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18d67c2a-547a-448f-8492-c4c997cc938e-kube-api-access-zrvq2" (OuterVolumeSpecName: "kube-api-access-zrvq2") pod "18d67c2a-547a-448f-8492-c4c997cc938e" (UID: "18d67c2a-547a-448f-8492-c4c997cc938e"). InnerVolumeSpecName "kube-api-access-zrvq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.244652 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-inventory" (OuterVolumeSpecName: "inventory") pod "18d67c2a-547a-448f-8492-c4c997cc938e" (UID: "18d67c2a-547a-448f-8492-c4c997cc938e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.244998 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "18d67c2a-547a-448f-8492-c4c997cc938e" (UID: "18d67c2a-547a-448f-8492-c4c997cc938e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.251705 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "18d67c2a-547a-448f-8492-c4c997cc938e" (UID: "18d67c2a-547a-448f-8492-c4c997cc938e"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.317296 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrvq2\" (UniqueName: \"kubernetes.io/projected/18d67c2a-547a-448f-8492-c4c997cc938e-kube-api-access-zrvq2\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.317657 4851 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.317669 4851 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.317678 4851 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.317690 4851 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18d67c2a-547a-448f-8492-c4c997cc938e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.730448 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" event={"ID":"18d67c2a-547a-448f-8492-c4c997cc938e","Type":"ContainerDied","Data":"d89c3f953e1bb101096c929955a5976e8966eada668f39d0e5f6ebbc0c6bcdb9"} Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.730514 4851 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d89c3f953e1bb101096c929955a5976e8966eada668f39d0e5f6ebbc0c6bcdb9" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.730583 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9s94l" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.843069 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx"] Oct 01 13:37:36 crc kubenswrapper[4851]: E1001 13:37:36.843610 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dbb888e-5850-4aca-bc65-386c6e112aac" containerName="registry-server" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.843636 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dbb888e-5850-4aca-bc65-386c6e112aac" containerName="registry-server" Oct 01 13:37:36 crc kubenswrapper[4851]: E1001 13:37:36.843662 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dbb888e-5850-4aca-bc65-386c6e112aac" containerName="extract-utilities" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.843671 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dbb888e-5850-4aca-bc65-386c6e112aac" containerName="extract-utilities" Oct 01 13:37:36 crc kubenswrapper[4851]: E1001 13:37:36.843694 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d67c2a-547a-448f-8492-c4c997cc938e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.843704 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d67c2a-547a-448f-8492-c4c997cc938e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 01 13:37:36 crc kubenswrapper[4851]: E1001 13:37:36.843716 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dbb888e-5850-4aca-bc65-386c6e112aac" containerName="extract-content" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.843723 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dbb888e-5850-4aca-bc65-386c6e112aac" containerName="extract-content" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.843909 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="18d67c2a-547a-448f-8492-c4c997cc938e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.843928 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dbb888e-5850-4aca-bc65-386c6e112aac" containerName="registry-server" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.844746 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.851215 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.851277 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.851463 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.851511 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.851720 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.851593 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2tz4d" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.851594 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.876614 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx"] Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.929433 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.929732 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/75a874c6-cd94-467a-ab74-bede44646604-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.930073 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.930124 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.930171 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzbzb\" (UniqueName: 
\"kubernetes.io/projected/75a874c6-cd94-467a-ab74-bede44646604-kube-api-access-vzbzb\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.930253 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.930303 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.930365 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:36 crc kubenswrapper[4851]: I1001 13:37:36.930441 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:37 crc kubenswrapper[4851]: I1001 13:37:37.032991 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:37 crc kubenswrapper[4851]: I1001 13:37:37.033070 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:37 crc kubenswrapper[4851]: I1001 13:37:37.033127 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzbzb\" (UniqueName: \"kubernetes.io/projected/75a874c6-cd94-467a-ab74-bede44646604-kube-api-access-vzbzb\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:37 crc kubenswrapper[4851]: I1001 13:37:37.033178 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:37 crc kubenswrapper[4851]: I1001 13:37:37.033210 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:37 crc kubenswrapper[4851]: I1001 13:37:37.033250 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:37 crc kubenswrapper[4851]: I1001 13:37:37.033305 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:37 crc kubenswrapper[4851]: I1001 13:37:37.033555 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:37 crc kubenswrapper[4851]: I1001 13:37:37.033648 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/75a874c6-cd94-467a-ab74-bede44646604-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:37 crc kubenswrapper[4851]: I1001 13:37:37.034645 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/75a874c6-cd94-467a-ab74-bede44646604-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:37 crc kubenswrapper[4851]: I1001 13:37:37.037732 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:37 crc kubenswrapper[4851]: I1001 13:37:37.037742 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-ssh-key\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:37 crc kubenswrapper[4851]: I1001 13:37:37.037909 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:37 crc kubenswrapper[4851]: I1001 13:37:37.038347 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:37 crc kubenswrapper[4851]: I1001 13:37:37.039890 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:37 crc kubenswrapper[4851]: I1001 13:37:37.041011 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:37 crc kubenswrapper[4851]: I1001 13:37:37.041521 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:37 crc kubenswrapper[4851]: I1001 13:37:37.060411 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzbzb\" (UniqueName: \"kubernetes.io/projected/75a874c6-cd94-467a-ab74-bede44646604-kube-api-access-vzbzb\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5sjkx\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:37 crc kubenswrapper[4851]: I1001 13:37:37.175910 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:37:37 crc kubenswrapper[4851]: I1001 13:37:37.755987 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx"] Oct 01 13:37:38 crc kubenswrapper[4851]: I1001 13:37:38.751092 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" event={"ID":"75a874c6-cd94-467a-ab74-bede44646604","Type":"ContainerStarted","Data":"d160465148f3bd699a84e719b81fdb95aa24a27c55a7a80047703b9220d33af8"} Oct 01 13:37:38 crc kubenswrapper[4851]: I1001 13:37:38.751448 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" event={"ID":"75a874c6-cd94-467a-ab74-bede44646604","Type":"ContainerStarted","Data":"7a3de0685213bcc67c692d9181113650e239a37585f8b844171df08e915731f9"} Oct 01 13:37:38 crc kubenswrapper[4851]: I1001 13:37:38.771473 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" podStartSLOduration=2.088256803 podStartE2EDuration="2.771452268s" podCreationTimestamp="2025-10-01 13:37:36 +0000 UTC" firstStartedPulling="2025-10-01 13:37:37.770799704 +0000 UTC m=+2666.115917190" lastFinishedPulling="2025-10-01 13:37:38.453995169 +0000 UTC m=+2666.799112655" observedRunningTime="2025-10-01 13:37:38.770580883 +0000 UTC m=+2667.115698369" watchObservedRunningTime="2025-10-01 13:37:38.771452268 +0000 UTC m=+2667.116569774" Oct 01 13:38:18 crc kubenswrapper[4851]: I1001 13:38:18.260766 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c8v28"] Oct 01 13:38:18 crc kubenswrapper[4851]: I1001 13:38:18.264884 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c8v28" Oct 01 13:38:18 crc kubenswrapper[4851]: I1001 13:38:18.278693 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c8v28"] Oct 01 13:38:18 crc kubenswrapper[4851]: I1001 13:38:18.407792 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwhkd\" (UniqueName: \"kubernetes.io/projected/3adab9bf-b139-4054-b37c-86bb01e01c3b-kube-api-access-lwhkd\") pod \"redhat-marketplace-c8v28\" (UID: \"3adab9bf-b139-4054-b37c-86bb01e01c3b\") " pod="openshift-marketplace/redhat-marketplace-c8v28" Oct 01 13:38:18 crc kubenswrapper[4851]: I1001 13:38:18.408184 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3adab9bf-b139-4054-b37c-86bb01e01c3b-catalog-content\") pod \"redhat-marketplace-c8v28\" (UID: \"3adab9bf-b139-4054-b37c-86bb01e01c3b\") " pod="openshift-marketplace/redhat-marketplace-c8v28" Oct 01 13:38:18 crc kubenswrapper[4851]: I1001 13:38:18.408248 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3adab9bf-b139-4054-b37c-86bb01e01c3b-utilities\") pod \"redhat-marketplace-c8v28\" (UID: \"3adab9bf-b139-4054-b37c-86bb01e01c3b\") " pod="openshift-marketplace/redhat-marketplace-c8v28" Oct 01 13:38:18 crc kubenswrapper[4851]: I1001 13:38:18.511466 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3adab9bf-b139-4054-b37c-86bb01e01c3b-catalog-content\") pod \"redhat-marketplace-c8v28\" (UID: \"3adab9bf-b139-4054-b37c-86bb01e01c3b\") " pod="openshift-marketplace/redhat-marketplace-c8v28" Oct 01 13:38:18 crc kubenswrapper[4851]: I1001 13:38:18.511596 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3adab9bf-b139-4054-b37c-86bb01e01c3b-utilities\") pod \"redhat-marketplace-c8v28\" (UID: \"3adab9bf-b139-4054-b37c-86bb01e01c3b\") " pod="openshift-marketplace/redhat-marketplace-c8v28" Oct 01 13:38:18 crc kubenswrapper[4851]: I1001 13:38:18.511852 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwhkd\" (UniqueName: \"kubernetes.io/projected/3adab9bf-b139-4054-b37c-86bb01e01c3b-kube-api-access-lwhkd\") pod \"redhat-marketplace-c8v28\" (UID: \"3adab9bf-b139-4054-b37c-86bb01e01c3b\") " pod="openshift-marketplace/redhat-marketplace-c8v28" Oct 01 13:38:18 crc kubenswrapper[4851]: I1001 13:38:18.512089 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3adab9bf-b139-4054-b37c-86bb01e01c3b-catalog-content\") pod \"redhat-marketplace-c8v28\" (UID: \"3adab9bf-b139-4054-b37c-86bb01e01c3b\") " pod="openshift-marketplace/redhat-marketplace-c8v28" Oct 01 13:38:18 crc kubenswrapper[4851]: I1001 13:38:18.512422 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3adab9bf-b139-4054-b37c-86bb01e01c3b-utilities\") pod \"redhat-marketplace-c8v28\" (UID: \"3adab9bf-b139-4054-b37c-86bb01e01c3b\") " pod="openshift-marketplace/redhat-marketplace-c8v28" Oct 01 13:38:18 crc kubenswrapper[4851]: I1001 13:38:18.560718 4851 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-lwhkd\" (UniqueName: \"kubernetes.io/projected/3adab9bf-b139-4054-b37c-86bb01e01c3b-kube-api-access-lwhkd\") pod \"redhat-marketplace-c8v28\" (UID: \"3adab9bf-b139-4054-b37c-86bb01e01c3b\") " pod="openshift-marketplace/redhat-marketplace-c8v28" Oct 01 13:38:18 crc kubenswrapper[4851]: I1001 13:38:18.597202 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c8v28" Oct 01 13:38:19 crc kubenswrapper[4851]: I1001 13:38:19.068602 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c8v28"] Oct 01 13:38:19 crc kubenswrapper[4851]: W1001 13:38:19.080568 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3adab9bf_b139_4054_b37c_86bb01e01c3b.slice/crio-a63fa7a217d7fb2caf32a29cd738a98226962b11b8eae9810f0002bf259112ee WatchSource:0}: Error finding container a63fa7a217d7fb2caf32a29cd738a98226962b11b8eae9810f0002bf259112ee: Status 404 returned error can't find the container with id a63fa7a217d7fb2caf32a29cd738a98226962b11b8eae9810f0002bf259112ee Oct 01 13:38:19 crc kubenswrapper[4851]: I1001 13:38:19.144491 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8v28" event={"ID":"3adab9bf-b139-4054-b37c-86bb01e01c3b","Type":"ContainerStarted","Data":"a63fa7a217d7fb2caf32a29cd738a98226962b11b8eae9810f0002bf259112ee"} Oct 01 13:38:20 crc kubenswrapper[4851]: I1001 13:38:20.158853 4851 generic.go:334] "Generic (PLEG): container finished" podID="3adab9bf-b139-4054-b37c-86bb01e01c3b" containerID="de7d1f14a8dc4b9c38406ce56147183cae9a16b0ee292f76f1a167eba5d538e5" exitCode=0 Oct 01 13:38:20 crc kubenswrapper[4851]: I1001 13:38:20.158935 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8v28" event={"ID":"3adab9bf-b139-4054-b37c-86bb01e01c3b","Type":"ContainerDied","Data":"de7d1f14a8dc4b9c38406ce56147183cae9a16b0ee292f76f1a167eba5d538e5"} Oct 01 13:38:21 crc kubenswrapper[4851]: I1001 13:38:21.168752 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8v28" event={"ID":"3adab9bf-b139-4054-b37c-86bb01e01c3b","Type":"ContainerStarted","Data":"58f17a9a26b6fc3d247a6cfe47f14a685d8a1a03c6c291630f44f6f1bcb119a6"} Oct 01 13:38:22 crc kubenswrapper[4851]: I1001 13:38:22.183002 4851 generic.go:334] "Generic (PLEG): container finished" podID="3adab9bf-b139-4054-b37c-86bb01e01c3b" containerID="58f17a9a26b6fc3d247a6cfe47f14a685d8a1a03c6c291630f44f6f1bcb119a6" exitCode=0 Oct 01 13:38:22 crc kubenswrapper[4851]: I1001 13:38:22.183054 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8v28" event={"ID":"3adab9bf-b139-4054-b37c-86bb01e01c3b","Type":"ContainerDied","Data":"58f17a9a26b6fc3d247a6cfe47f14a685d8a1a03c6c291630f44f6f1bcb119a6"} Oct 01 13:38:23 crc kubenswrapper[4851]: I1001 13:38:23.192732 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8v28" event={"ID":"3adab9bf-b139-4054-b37c-86bb01e01c3b","Type":"ContainerStarted","Data":"fdb8ba43788e5c7944628a7dba440b18c665695da106c0ba098646aae2767f51"} Oct 01 13:38:23 crc kubenswrapper[4851]: I1001 13:38:23.214036 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c8v28" podStartSLOduration=2.7521454309999998 
podStartE2EDuration="5.214015981s" podCreationTimestamp="2025-10-01 13:38:18 +0000 UTC" firstStartedPulling="2025-10-01 13:38:20.163349699 +0000 UTC m=+2708.508467195" lastFinishedPulling="2025-10-01 13:38:22.625220259 +0000 UTC m=+2710.970337745" observedRunningTime="2025-10-01 13:38:23.207735551 +0000 UTC m=+2711.552853057" watchObservedRunningTime="2025-10-01 13:38:23.214015981 +0000 UTC m=+2711.559133467" Oct 01 13:38:28 crc kubenswrapper[4851]: I1001 13:38:28.597646 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c8v28" Oct 01 13:38:28 crc kubenswrapper[4851]: I1001 13:38:28.598266 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c8v28" Oct 01 13:38:28 crc kubenswrapper[4851]: I1001 13:38:28.664190 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c8v28" Oct 01 13:38:29 crc kubenswrapper[4851]: I1001 13:38:29.305781 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c8v28" Oct 01 13:38:29 crc kubenswrapper[4851]: I1001 13:38:29.372691 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c8v28"] Oct 01 13:38:30 crc kubenswrapper[4851]: I1001 13:38:30.050925 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:38:30 crc kubenswrapper[4851]: I1001 13:38:30.051033 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:38:31 crc kubenswrapper[4851]: I1001 13:38:31.269485 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c8v28" podUID="3adab9bf-b139-4054-b37c-86bb01e01c3b" containerName="registry-server" containerID="cri-o://fdb8ba43788e5c7944628a7dba440b18c665695da106c0ba098646aae2767f51" gracePeriod=2 Oct 01 13:38:31 crc kubenswrapper[4851]: I1001 13:38:31.797546 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c8v28" Oct 01 13:38:31 crc kubenswrapper[4851]: I1001 13:38:31.883157 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3adab9bf-b139-4054-b37c-86bb01e01c3b-catalog-content\") pod \"3adab9bf-b139-4054-b37c-86bb01e01c3b\" (UID: \"3adab9bf-b139-4054-b37c-86bb01e01c3b\") " Oct 01 13:38:31 crc kubenswrapper[4851]: I1001 13:38:31.883410 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3adab9bf-b139-4054-b37c-86bb01e01c3b-utilities\") pod \"3adab9bf-b139-4054-b37c-86bb01e01c3b\" (UID: \"3adab9bf-b139-4054-b37c-86bb01e01c3b\") " Oct 01 13:38:31 crc kubenswrapper[4851]: I1001 13:38:31.883454 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwhkd\" (UniqueName: \"kubernetes.io/projected/3adab9bf-b139-4054-b37c-86bb01e01c3b-kube-api-access-lwhkd\") pod \"3adab9bf-b139-4054-b37c-86bb01e01c3b\" (UID: \"3adab9bf-b139-4054-b37c-86bb01e01c3b\") " Oct 01 13:38:31 crc kubenswrapper[4851]: I1001 13:38:31.884268 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3adab9bf-b139-4054-b37c-86bb01e01c3b-utilities" (OuterVolumeSpecName: "utilities") pod "3adab9bf-b139-4054-b37c-86bb01e01c3b" (UID: "3adab9bf-b139-4054-b37c-86bb01e01c3b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:38:31 crc kubenswrapper[4851]: I1001 13:38:31.889751 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3adab9bf-b139-4054-b37c-86bb01e01c3b-kube-api-access-lwhkd" (OuterVolumeSpecName: "kube-api-access-lwhkd") pod "3adab9bf-b139-4054-b37c-86bb01e01c3b" (UID: "3adab9bf-b139-4054-b37c-86bb01e01c3b"). InnerVolumeSpecName "kube-api-access-lwhkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:38:31 crc kubenswrapper[4851]: I1001 13:38:31.912404 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3adab9bf-b139-4054-b37c-86bb01e01c3b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3adab9bf-b139-4054-b37c-86bb01e01c3b" (UID: "3adab9bf-b139-4054-b37c-86bb01e01c3b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:38:31 crc kubenswrapper[4851]: I1001 13:38:31.986397 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3adab9bf-b139-4054-b37c-86bb01e01c3b-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:38:31 crc kubenswrapper[4851]: I1001 13:38:31.986459 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwhkd\" (UniqueName: \"kubernetes.io/projected/3adab9bf-b139-4054-b37c-86bb01e01c3b-kube-api-access-lwhkd\") on node \"crc\" DevicePath \"\"" Oct 01 13:38:31 crc kubenswrapper[4851]: I1001 13:38:31.986483 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3adab9bf-b139-4054-b37c-86bb01e01c3b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:38:32 crc kubenswrapper[4851]: I1001 13:38:32.282874 4851 generic.go:334] "Generic (PLEG): container finished" podID="3adab9bf-b139-4054-b37c-86bb01e01c3b" containerID="fdb8ba43788e5c7944628a7dba440b18c665695da106c0ba098646aae2767f51" exitCode=0 Oct 01 13:38:32 crc kubenswrapper[4851]: I1001 13:38:32.282945 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8v28" event={"ID":"3adab9bf-b139-4054-b37c-86bb01e01c3b","Type":"ContainerDied","Data":"fdb8ba43788e5c7944628a7dba440b18c665695da106c0ba098646aae2767f51"} Oct 01 13:38:32 crc kubenswrapper[4851]: I1001 13:38:32.283282 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8v28" event={"ID":"3adab9bf-b139-4054-b37c-86bb01e01c3b","Type":"ContainerDied","Data":"a63fa7a217d7fb2caf32a29cd738a98226962b11b8eae9810f0002bf259112ee"} Oct 01 13:38:32 crc kubenswrapper[4851]: I1001 13:38:32.283312 4851 scope.go:117] "RemoveContainer" containerID="fdb8ba43788e5c7944628a7dba440b18c665695da106c0ba098646aae2767f51" Oct 01 13:38:32 crc kubenswrapper[4851]: I1001 13:38:32.282995 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c8v28" Oct 01 13:38:32 crc kubenswrapper[4851]: I1001 13:38:32.308337 4851 scope.go:117] "RemoveContainer" containerID="58f17a9a26b6fc3d247a6cfe47f14a685d8a1a03c6c291630f44f6f1bcb119a6" Oct 01 13:38:32 crc kubenswrapper[4851]: I1001 13:38:32.350981 4851 scope.go:117] "RemoveContainer" containerID="de7d1f14a8dc4b9c38406ce56147183cae9a16b0ee292f76f1a167eba5d538e5" Oct 01 13:38:32 crc kubenswrapper[4851]: I1001 13:38:32.370025 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c8v28"] Oct 01 13:38:32 crc kubenswrapper[4851]: I1001 13:38:32.372045 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c8v28"] Oct 01 13:38:32 crc kubenswrapper[4851]: I1001 13:38:32.407334 4851 scope.go:117] "RemoveContainer" containerID="fdb8ba43788e5c7944628a7dba440b18c665695da106c0ba098646aae2767f51" Oct 01 13:38:32 crc kubenswrapper[4851]: E1001 13:38:32.408009 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdb8ba43788e5c7944628a7dba440b18c665695da106c0ba098646aae2767f51\": container with ID starting with fdb8ba43788e5c7944628a7dba440b18c665695da106c0ba098646aae2767f51 not found: ID does not exist" containerID="fdb8ba43788e5c7944628a7dba440b18c665695da106c0ba098646aae2767f51" Oct 01 13:38:32 crc kubenswrapper[4851]: I1001 13:38:32.408057 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdb8ba43788e5c7944628a7dba440b18c665695da106c0ba098646aae2767f51"} err="failed to get container status \"fdb8ba43788e5c7944628a7dba440b18c665695da106c0ba098646aae2767f51\": rpc error: code = NotFound desc = could not find container \"fdb8ba43788e5c7944628a7dba440b18c665695da106c0ba098646aae2767f51\": container with ID starting with fdb8ba43788e5c7944628a7dba440b18c665695da106c0ba098646aae2767f51 not found: ID does not exist" Oct 01 13:38:32 crc kubenswrapper[4851]: I1001 13:38:32.408085 4851 scope.go:117] "RemoveContainer" containerID="58f17a9a26b6fc3d247a6cfe47f14a685d8a1a03c6c291630f44f6f1bcb119a6" Oct 01 13:38:32 crc kubenswrapper[4851]: E1001 13:38:32.408577 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58f17a9a26b6fc3d247a6cfe47f14a685d8a1a03c6c291630f44f6f1bcb119a6\": container with ID starting with 58f17a9a26b6fc3d247a6cfe47f14a685d8a1a03c6c291630f44f6f1bcb119a6 not found: ID does not exist" containerID="58f17a9a26b6fc3d247a6cfe47f14a685d8a1a03c6c291630f44f6f1bcb119a6" Oct 01 13:38:32 crc kubenswrapper[4851]: I1001 13:38:32.408637 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58f17a9a26b6fc3d247a6cfe47f14a685d8a1a03c6c291630f44f6f1bcb119a6"} err="failed to get container status \"58f17a9a26b6fc3d247a6cfe47f14a685d8a1a03c6c291630f44f6f1bcb119a6\": rpc error: code = NotFound desc = could not find container \"58f17a9a26b6fc3d247a6cfe47f14a685d8a1a03c6c291630f44f6f1bcb119a6\": container with ID starting with 58f17a9a26b6fc3d247a6cfe47f14a685d8a1a03c6c291630f44f6f1bcb119a6 not found: ID does not exist" Oct 01 13:38:32 crc kubenswrapper[4851]: I1001 13:38:32.408672 4851 scope.go:117] "RemoveContainer" containerID="de7d1f14a8dc4b9c38406ce56147183cae9a16b0ee292f76f1a167eba5d538e5" Oct 01 13:38:32 crc kubenswrapper[4851]: E1001 13:38:32.409036 4851 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"de7d1f14a8dc4b9c38406ce56147183cae9a16b0ee292f76f1a167eba5d538e5\": container with ID starting with de7d1f14a8dc4b9c38406ce56147183cae9a16b0ee292f76f1a167eba5d538e5 not found: ID does not exist" containerID="de7d1f14a8dc4b9c38406ce56147183cae9a16b0ee292f76f1a167eba5d538e5" Oct 01 13:38:32 crc kubenswrapper[4851]: I1001 13:38:32.409069 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de7d1f14a8dc4b9c38406ce56147183cae9a16b0ee292f76f1a167eba5d538e5"} err="failed to get container status \"de7d1f14a8dc4b9c38406ce56147183cae9a16b0ee292f76f1a167eba5d538e5\": rpc error: code = NotFound desc = could not find container \"de7d1f14a8dc4b9c38406ce56147183cae9a16b0ee292f76f1a167eba5d538e5\": container with ID starting with de7d1f14a8dc4b9c38406ce56147183cae9a16b0ee292f76f1a167eba5d538e5 not found: ID does not exist" Oct 01 13:38:34 crc kubenswrapper[4851]: I1001 13:38:34.345019 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3adab9bf-b139-4054-b37c-86bb01e01c3b" path="/var/lib/kubelet/pods/3adab9bf-b139-4054-b37c-86bb01e01c3b/volumes" Oct 01 13:38:42 crc kubenswrapper[4851]: I1001 13:38:42.610738 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pqsbf"] Oct 01 13:38:42 crc kubenswrapper[4851]: E1001 13:38:42.613791 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3adab9bf-b139-4054-b37c-86bb01e01c3b" containerName="registry-server" Oct 01 13:38:42 crc kubenswrapper[4851]: I1001 13:38:42.613819 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="3adab9bf-b139-4054-b37c-86bb01e01c3b" containerName="registry-server" Oct 01 13:38:42 crc kubenswrapper[4851]: E1001 13:38:42.613838 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3adab9bf-b139-4054-b37c-86bb01e01c3b" containerName="extract-utilities" Oct 01 13:38:42 crc kubenswrapper[4851]: I1001 13:38:42.613846 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="3adab9bf-b139-4054-b37c-86bb01e01c3b" containerName="extract-utilities" Oct 01 13:38:42 crc kubenswrapper[4851]: E1001 13:38:42.613866 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3adab9bf-b139-4054-b37c-86bb01e01c3b" containerName="extract-content" Oct 01 13:38:42 crc kubenswrapper[4851]: I1001 13:38:42.613874 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="3adab9bf-b139-4054-b37c-86bb01e01c3b" containerName="extract-content" Oct 01 13:38:42 crc kubenswrapper[4851]: I1001 13:38:42.614656 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="3adab9bf-b139-4054-b37c-86bb01e01c3b" containerName="registry-server" Oct 01 13:38:42 crc kubenswrapper[4851]: I1001 13:38:42.616868 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pqsbf" Oct 01 13:38:42 crc kubenswrapper[4851]: I1001 13:38:42.638294 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pqsbf"] Oct 01 13:38:42 crc kubenswrapper[4851]: I1001 13:38:42.732218 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps76p\" (UniqueName: \"kubernetes.io/projected/bcde5055-0260-4621-b8f6-ee46c09720de-kube-api-access-ps76p\") pod \"certified-operators-pqsbf\" (UID: \"bcde5055-0260-4621-b8f6-ee46c09720de\") " pod="openshift-marketplace/certified-operators-pqsbf" Oct 01 13:38:42 crc kubenswrapper[4851]: I1001 13:38:42.732711 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcde5055-0260-4621-b8f6-ee46c09720de-utilities\") pod \"certified-operators-pqsbf\" (UID: \"bcde5055-0260-4621-b8f6-ee46c09720de\") " pod="openshift-marketplace/certified-operators-pqsbf" Oct 01 13:38:42 crc kubenswrapper[4851]: I1001 13:38:42.732870 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcde5055-0260-4621-b8f6-ee46c09720de-catalog-content\") pod \"certified-operators-pqsbf\" (UID: \"bcde5055-0260-4621-b8f6-ee46c09720de\") " pod="openshift-marketplace/certified-operators-pqsbf" Oct 01 13:38:42 crc kubenswrapper[4851]: I1001 13:38:42.835134 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps76p\" (UniqueName: \"kubernetes.io/projected/bcde5055-0260-4621-b8f6-ee46c09720de-kube-api-access-ps76p\") pod \"certified-operators-pqsbf\" (UID: \"bcde5055-0260-4621-b8f6-ee46c09720de\") " pod="openshift-marketplace/certified-operators-pqsbf" Oct 01 13:38:42 crc kubenswrapper[4851]: I1001 13:38:42.835286 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcde5055-0260-4621-b8f6-ee46c09720de-utilities\") pod \"certified-operators-pqsbf\" (UID: \"bcde5055-0260-4621-b8f6-ee46c09720de\") " pod="openshift-marketplace/certified-operators-pqsbf" Oct 01 13:38:42 crc kubenswrapper[4851]: I1001 13:38:42.835356 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcde5055-0260-4621-b8f6-ee46c09720de-catalog-content\") pod \"certified-operators-pqsbf\" (UID: \"bcde5055-0260-4621-b8f6-ee46c09720de\") " pod="openshift-marketplace/certified-operators-pqsbf" Oct 01 13:38:42 crc kubenswrapper[4851]: I1001 13:38:42.835948 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcde5055-0260-4621-b8f6-ee46c09720de-catalog-content\") pod \"certified-operators-pqsbf\" (UID: \"bcde5055-0260-4621-b8f6-ee46c09720de\") " pod="openshift-marketplace/certified-operators-pqsbf" Oct 01 13:38:42 crc kubenswrapper[4851]: I1001 13:38:42.835960 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcde5055-0260-4621-b8f6-ee46c09720de-utilities\") pod \"certified-operators-pqsbf\" (UID: \"bcde5055-0260-4621-b8f6-ee46c09720de\") " pod="openshift-marketplace/certified-operators-pqsbf" Oct 01 13:38:42 crc kubenswrapper[4851]: I1001 13:38:42.861109 4851 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ps76p\" (UniqueName: \"kubernetes.io/projected/bcde5055-0260-4621-b8f6-ee46c09720de-kube-api-access-ps76p\") pod \"certified-operators-pqsbf\" (UID: \"bcde5055-0260-4621-b8f6-ee46c09720de\") " pod="openshift-marketplace/certified-operators-pqsbf" Oct 01 13:38:42 crc kubenswrapper[4851]: I1001 13:38:42.939820 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pqsbf" Oct 01 13:38:43 crc kubenswrapper[4851]: I1001 13:38:43.456401 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pqsbf"] Oct 01 13:38:44 crc kubenswrapper[4851]: I1001 13:38:44.403739 4851 generic.go:334] "Generic (PLEG): container finished" podID="bcde5055-0260-4621-b8f6-ee46c09720de" containerID="70ab43d643fdeedd05fa9c681ef824045bc3ae140af76f405da83d6071fc4a63" exitCode=0 Oct 01 13:38:44 crc kubenswrapper[4851]: I1001 13:38:44.403944 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pqsbf" event={"ID":"bcde5055-0260-4621-b8f6-ee46c09720de","Type":"ContainerDied","Data":"70ab43d643fdeedd05fa9c681ef824045bc3ae140af76f405da83d6071fc4a63"} Oct 01 13:38:44 crc kubenswrapper[4851]: I1001 13:38:44.403985 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pqsbf" event={"ID":"bcde5055-0260-4621-b8f6-ee46c09720de","Type":"ContainerStarted","Data":"9cc0759b43dbb88cc9d725afabe942f3fb3fc36b5c61f8a1b0608584377b045a"} Oct 01 13:38:46 crc kubenswrapper[4851]: I1001 13:38:46.424651 4851 generic.go:334] "Generic (PLEG): container finished" podID="bcde5055-0260-4621-b8f6-ee46c09720de" containerID="72e6e1ad581ee74a925c6117f2db79415a1f2e15b01c2c949ae0343cc8dbc741" exitCode=0 Oct 01 13:38:46 crc kubenswrapper[4851]: I1001 13:38:46.424738 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pqsbf" event={"ID":"bcde5055-0260-4621-b8f6-ee46c09720de","Type":"ContainerDied","Data":"72e6e1ad581ee74a925c6117f2db79415a1f2e15b01c2c949ae0343cc8dbc741"} Oct 01 13:38:47 crc kubenswrapper[4851]: I1001 13:38:47.440953 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pqsbf" event={"ID":"bcde5055-0260-4621-b8f6-ee46c09720de","Type":"ContainerStarted","Data":"7470bb5b66c4f658386d73de8db501a2f35d9419ce0b2cb1975701ae6e45464f"} Oct 01 13:38:47 crc kubenswrapper[4851]: I1001 13:38:47.475277 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pqsbf" podStartSLOduration=2.821246096 podStartE2EDuration="5.475254426s" podCreationTimestamp="2025-10-01 13:38:42 +0000 UTC" firstStartedPulling="2025-10-01 13:38:44.40682382 +0000 UTC m=+2732.751941346" lastFinishedPulling="2025-10-01 13:38:47.06083219 +0000 UTC m=+2735.405949676" observedRunningTime="2025-10-01 13:38:47.463285044 +0000 UTC m=+2735.808402540" watchObservedRunningTime="2025-10-01 13:38:47.475254426 +0000 UTC m=+2735.820371912" Oct 01 13:38:52 crc kubenswrapper[4851]: I1001 13:38:52.940649 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pqsbf" Oct 01 13:38:52 crc kubenswrapper[4851]: I1001 13:38:52.941259 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pqsbf" Oct 01 13:38:53 crc kubenswrapper[4851]: I1001 13:38:53.027742 4851 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pqsbf" Oct 01 13:38:53 crc kubenswrapper[4851]: I1001 13:38:53.549128 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pqsbf" Oct 01 13:38:54 crc kubenswrapper[4851]: I1001 13:38:54.594705 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pqsbf"] Oct 01 13:38:55 crc kubenswrapper[4851]: I1001 13:38:55.524091 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pqsbf" podUID="bcde5055-0260-4621-b8f6-ee46c09720de" containerName="registry-server" containerID="cri-o://7470bb5b66c4f658386d73de8db501a2f35d9419ce0b2cb1975701ae6e45464f" gracePeriod=2 Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.079718 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pqsbf" Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.227264 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcde5055-0260-4621-b8f6-ee46c09720de-catalog-content\") pod \"bcde5055-0260-4621-b8f6-ee46c09720de\" (UID: \"bcde5055-0260-4621-b8f6-ee46c09720de\") " Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.227559 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps76p\" (UniqueName: \"kubernetes.io/projected/bcde5055-0260-4621-b8f6-ee46c09720de-kube-api-access-ps76p\") pod \"bcde5055-0260-4621-b8f6-ee46c09720de\" (UID: \"bcde5055-0260-4621-b8f6-ee46c09720de\") " Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.227781 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcde5055-0260-4621-b8f6-ee46c09720de-utilities\") pod \"bcde5055-0260-4621-b8f6-ee46c09720de\" (UID: \"bcde5055-0260-4621-b8f6-ee46c09720de\") " Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.233350 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcde5055-0260-4621-b8f6-ee46c09720de-utilities" (OuterVolumeSpecName: "utilities") pod "bcde5055-0260-4621-b8f6-ee46c09720de" (UID: "bcde5055-0260-4621-b8f6-ee46c09720de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.244763 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcde5055-0260-4621-b8f6-ee46c09720de-kube-api-access-ps76p" (OuterVolumeSpecName: "kube-api-access-ps76p") pod "bcde5055-0260-4621-b8f6-ee46c09720de" (UID: "bcde5055-0260-4621-b8f6-ee46c09720de"). InnerVolumeSpecName "kube-api-access-ps76p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.329807 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps76p\" (UniqueName: \"kubernetes.io/projected/bcde5055-0260-4621-b8f6-ee46c09720de-kube-api-access-ps76p\") on node \"crc\" DevicePath \"\"" Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.329846 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcde5055-0260-4621-b8f6-ee46c09720de-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.539542 4851 generic.go:334] "Generic (PLEG): container finished" podID="bcde5055-0260-4621-b8f6-ee46c09720de" containerID="7470bb5b66c4f658386d73de8db501a2f35d9419ce0b2cb1975701ae6e45464f" exitCode=0 Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.539600 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pqsbf" event={"ID":"bcde5055-0260-4621-b8f6-ee46c09720de","Type":"ContainerDied","Data":"7470bb5b66c4f658386d73de8db501a2f35d9419ce0b2cb1975701ae6e45464f"} Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.539631 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pqsbf" event={"ID":"bcde5055-0260-4621-b8f6-ee46c09720de","Type":"ContainerDied","Data":"9cc0759b43dbb88cc9d725afabe942f3fb3fc36b5c61f8a1b0608584377b045a"} Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.539650 4851 scope.go:117] "RemoveContainer" containerID="7470bb5b66c4f658386d73de8db501a2f35d9419ce0b2cb1975701ae6e45464f" Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.539701 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pqsbf" Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.577163 4851 scope.go:117] "RemoveContainer" containerID="72e6e1ad581ee74a925c6117f2db79415a1f2e15b01c2c949ae0343cc8dbc741" Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.602960 4851 scope.go:117] "RemoveContainer" containerID="70ab43d643fdeedd05fa9c681ef824045bc3ae140af76f405da83d6071fc4a63" Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.645563 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcde5055-0260-4621-b8f6-ee46c09720de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bcde5055-0260-4621-b8f6-ee46c09720de" (UID: "bcde5055-0260-4621-b8f6-ee46c09720de"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.656776 4851 scope.go:117] "RemoveContainer" containerID="7470bb5b66c4f658386d73de8db501a2f35d9419ce0b2cb1975701ae6e45464f" Oct 01 13:38:56 crc kubenswrapper[4851]: E1001 13:38:56.657378 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7470bb5b66c4f658386d73de8db501a2f35d9419ce0b2cb1975701ae6e45464f\": container with ID starting with 7470bb5b66c4f658386d73de8db501a2f35d9419ce0b2cb1975701ae6e45464f not found: ID does not exist" containerID="7470bb5b66c4f658386d73de8db501a2f35d9419ce0b2cb1975701ae6e45464f" Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.657607 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7470bb5b66c4f658386d73de8db501a2f35d9419ce0b2cb1975701ae6e45464f"} err="failed to get container status \"7470bb5b66c4f658386d73de8db501a2f35d9419ce0b2cb1975701ae6e45464f\": rpc error: code = NotFound desc = could not find container \"7470bb5b66c4f658386d73de8db501a2f35d9419ce0b2cb1975701ae6e45464f\": container with ID starting with 7470bb5b66c4f658386d73de8db501a2f35d9419ce0b2cb1975701ae6e45464f not found: ID does not exist" Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.657773 4851 scope.go:117] "RemoveContainer" containerID="72e6e1ad581ee74a925c6117f2db79415a1f2e15b01c2c949ae0343cc8dbc741" Oct 01 13:38:56 crc kubenswrapper[4851]: E1001 13:38:56.658244 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72e6e1ad581ee74a925c6117f2db79415a1f2e15b01c2c949ae0343cc8dbc741\": container with ID starting with 72e6e1ad581ee74a925c6117f2db79415a1f2e15b01c2c949ae0343cc8dbc741 not found: ID does not exist" containerID="72e6e1ad581ee74a925c6117f2db79415a1f2e15b01c2c949ae0343cc8dbc741" Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.658285 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72e6e1ad581ee74a925c6117f2db79415a1f2e15b01c2c949ae0343cc8dbc741"} err="failed to get container status \"72e6e1ad581ee74a925c6117f2db79415a1f2e15b01c2c949ae0343cc8dbc741\": rpc error: code = NotFound desc = could not find container \"72e6e1ad581ee74a925c6117f2db79415a1f2e15b01c2c949ae0343cc8dbc741\": container with ID starting with 72e6e1ad581ee74a925c6117f2db79415a1f2e15b01c2c949ae0343cc8dbc741 not found: ID does not exist" Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.658317 4851 scope.go:117] "RemoveContainer" containerID="70ab43d643fdeedd05fa9c681ef824045bc3ae140af76f405da83d6071fc4a63" Oct 01 13:38:56 crc kubenswrapper[4851]: E1001 13:38:56.658786 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70ab43d643fdeedd05fa9c681ef824045bc3ae140af76f405da83d6071fc4a63\": container with ID starting with 70ab43d643fdeedd05fa9c681ef824045bc3ae140af76f405da83d6071fc4a63 not found: ID does not exist" containerID="70ab43d643fdeedd05fa9c681ef824045bc3ae140af76f405da83d6071fc4a63" Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.658934 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70ab43d643fdeedd05fa9c681ef824045bc3ae140af76f405da83d6071fc4a63"} err="failed to get container status \"70ab43d643fdeedd05fa9c681ef824045bc3ae140af76f405da83d6071fc4a63\": rpc error: code = NotFound desc = could not 
find container \"70ab43d643fdeedd05fa9c681ef824045bc3ae140af76f405da83d6071fc4a63\": container with ID starting with 70ab43d643fdeedd05fa9c681ef824045bc3ae140af76f405da83d6071fc4a63 not found: ID does not exist" Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.738765 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcde5055-0260-4621-b8f6-ee46c09720de-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.873211 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pqsbf"] Oct 01 13:38:56 crc kubenswrapper[4851]: I1001 13:38:56.881299 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pqsbf"] Oct 01 13:38:58 crc kubenswrapper[4851]: I1001 13:38:58.338825 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcde5055-0260-4621-b8f6-ee46c09720de" path="/var/lib/kubelet/pods/bcde5055-0260-4621-b8f6-ee46c09720de/volumes" Oct 01 13:39:00 crc kubenswrapper[4851]: I1001 13:39:00.050004 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:39:00 crc kubenswrapper[4851]: I1001 13:39:00.050366 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:39:23 crc kubenswrapper[4851]: I1001 13:39:23.806692 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6dnsw"] Oct 01 13:39:23 crc kubenswrapper[4851]: E1001 13:39:23.807641 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcde5055-0260-4621-b8f6-ee46c09720de" containerName="extract-utilities" Oct 01 13:39:23 crc kubenswrapper[4851]: I1001 13:39:23.807655 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcde5055-0260-4621-b8f6-ee46c09720de" containerName="extract-utilities" Oct 01 13:39:23 crc kubenswrapper[4851]: E1001 13:39:23.807685 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcde5055-0260-4621-b8f6-ee46c09720de" containerName="extract-content" Oct 01 13:39:23 crc kubenswrapper[4851]: I1001 13:39:23.807692 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcde5055-0260-4621-b8f6-ee46c09720de" containerName="extract-content" Oct 01 13:39:23 crc kubenswrapper[4851]: E1001 13:39:23.807704 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcde5055-0260-4621-b8f6-ee46c09720de" containerName="registry-server" Oct 01 13:39:23 crc kubenswrapper[4851]: I1001 13:39:23.807710 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcde5055-0260-4621-b8f6-ee46c09720de" containerName="registry-server" Oct 01 13:39:23 crc kubenswrapper[4851]: I1001 13:39:23.807917 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcde5055-0260-4621-b8f6-ee46c09720de" containerName="registry-server" Oct 01 13:39:23 crc kubenswrapper[4851]: I1001 13:39:23.810356 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6dnsw" Oct 01 13:39:23 crc kubenswrapper[4851]: I1001 13:39:23.843317 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6dnsw"] Oct 01 13:39:23 crc kubenswrapper[4851]: I1001 13:39:23.917788 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/833bc1ff-abf6-45c3-899d-46b15ee1a6f1-utilities\") pod \"redhat-operators-6dnsw\" (UID: \"833bc1ff-abf6-45c3-899d-46b15ee1a6f1\") " pod="openshift-marketplace/redhat-operators-6dnsw" Oct 01 13:39:23 crc kubenswrapper[4851]: I1001 13:39:23.917937 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfv2d\" (UniqueName: \"kubernetes.io/projected/833bc1ff-abf6-45c3-899d-46b15ee1a6f1-kube-api-access-hfv2d\") pod \"redhat-operators-6dnsw\" (UID: \"833bc1ff-abf6-45c3-899d-46b15ee1a6f1\") " pod="openshift-marketplace/redhat-operators-6dnsw" Oct 01 13:39:23 crc kubenswrapper[4851]: I1001 13:39:23.918019 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/833bc1ff-abf6-45c3-899d-46b15ee1a6f1-catalog-content\") pod \"redhat-operators-6dnsw\" (UID: \"833bc1ff-abf6-45c3-899d-46b15ee1a6f1\") " pod="openshift-marketplace/redhat-operators-6dnsw" Oct 01 13:39:24 crc kubenswrapper[4851]: I1001 13:39:24.019447 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfv2d\" (UniqueName: \"kubernetes.io/projected/833bc1ff-abf6-45c3-899d-46b15ee1a6f1-kube-api-access-hfv2d\") pod \"redhat-operators-6dnsw\" (UID: \"833bc1ff-abf6-45c3-899d-46b15ee1a6f1\") " pod="openshift-marketplace/redhat-operators-6dnsw" Oct 01 13:39:24 crc kubenswrapper[4851]: I1001 13:39:24.019565 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/833bc1ff-abf6-45c3-899d-46b15ee1a6f1-catalog-content\") pod \"redhat-operators-6dnsw\" (UID: \"833bc1ff-abf6-45c3-899d-46b15ee1a6f1\") " pod="openshift-marketplace/redhat-operators-6dnsw" Oct 01 13:39:24 crc kubenswrapper[4851]: I1001 13:39:24.019634 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/833bc1ff-abf6-45c3-899d-46b15ee1a6f1-utilities\") pod \"redhat-operators-6dnsw\" (UID: \"833bc1ff-abf6-45c3-899d-46b15ee1a6f1\") " pod="openshift-marketplace/redhat-operators-6dnsw" Oct 01 13:39:24 crc kubenswrapper[4851]: I1001 13:39:24.020079 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/833bc1ff-abf6-45c3-899d-46b15ee1a6f1-utilities\") pod \"redhat-operators-6dnsw\" (UID: \"833bc1ff-abf6-45c3-899d-46b15ee1a6f1\") " pod="openshift-marketplace/redhat-operators-6dnsw" Oct 01 13:39:24 crc kubenswrapper[4851]: I1001 13:39:24.020171 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/833bc1ff-abf6-45c3-899d-46b15ee1a6f1-catalog-content\") pod \"redhat-operators-6dnsw\" (UID: \"833bc1ff-abf6-45c3-899d-46b15ee1a6f1\") " pod="openshift-marketplace/redhat-operators-6dnsw" Oct 01 13:39:24 crc kubenswrapper[4851]: I1001 13:39:24.039599 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hfv2d\" (UniqueName: \"kubernetes.io/projected/833bc1ff-abf6-45c3-899d-46b15ee1a6f1-kube-api-access-hfv2d\") pod \"redhat-operators-6dnsw\" (UID: \"833bc1ff-abf6-45c3-899d-46b15ee1a6f1\") " pod="openshift-marketplace/redhat-operators-6dnsw" Oct 01 13:39:24 crc kubenswrapper[4851]: I1001 13:39:24.141417 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6dnsw" Oct 01 13:39:24 crc kubenswrapper[4851]: I1001 13:39:24.625746 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6dnsw"] Oct 01 13:39:24 crc kubenswrapper[4851]: I1001 13:39:24.839479 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dnsw" event={"ID":"833bc1ff-abf6-45c3-899d-46b15ee1a6f1","Type":"ContainerStarted","Data":"51ab7ac7798a735a00b6e61b595554bbdc02fc93c8538d912c5f352985502cc3"} Oct 01 13:39:25 crc kubenswrapper[4851]: I1001 13:39:25.852474 4851 generic.go:334] "Generic (PLEG): container finished" podID="833bc1ff-abf6-45c3-899d-46b15ee1a6f1" containerID="24de540bd66c1e7d21c9e8d2fb9da9a532fd05df0c6ae1cce2341e6fd656ff73" exitCode=0 Oct 01 13:39:25 crc kubenswrapper[4851]: I1001 13:39:25.852541 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dnsw" event={"ID":"833bc1ff-abf6-45c3-899d-46b15ee1a6f1","Type":"ContainerDied","Data":"24de540bd66c1e7d21c9e8d2fb9da9a532fd05df0c6ae1cce2341e6fd656ff73"} Oct 01 13:39:26 crc kubenswrapper[4851]: I1001 13:39:26.864149 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dnsw" event={"ID":"833bc1ff-abf6-45c3-899d-46b15ee1a6f1","Type":"ContainerStarted","Data":"6ecd83b1264b3d848fd188fbccb87848f3910be2fe0ef4e8f528f42801cf38ef"} Oct 01 13:39:27 crc kubenswrapper[4851]: I1001 13:39:27.875798 4851 generic.go:334] "Generic (PLEG): container finished" podID="833bc1ff-abf6-45c3-899d-46b15ee1a6f1" containerID="6ecd83b1264b3d848fd188fbccb87848f3910be2fe0ef4e8f528f42801cf38ef" exitCode=0 Oct 01 13:39:27 crc kubenswrapper[4851]: I1001 13:39:27.876065 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dnsw" event={"ID":"833bc1ff-abf6-45c3-899d-46b15ee1a6f1","Type":"ContainerDied","Data":"6ecd83b1264b3d848fd188fbccb87848f3910be2fe0ef4e8f528f42801cf38ef"} Oct 01 13:39:28 crc kubenswrapper[4851]: I1001 13:39:28.893184 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dnsw" event={"ID":"833bc1ff-abf6-45c3-899d-46b15ee1a6f1","Type":"ContainerStarted","Data":"4600bf40d227745966d4241e9977952d767016375e854ffb5e5077bfbb208524"} Oct 01 13:39:28 crc kubenswrapper[4851]: I1001 13:39:28.918002 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6dnsw" podStartSLOduration=3.329128577 podStartE2EDuration="5.917986419s" podCreationTimestamp="2025-10-01 13:39:23 +0000 UTC" firstStartedPulling="2025-10-01 13:39:25.856295685 +0000 UTC m=+2774.201413181" lastFinishedPulling="2025-10-01 13:39:28.445153517 +0000 UTC m=+2776.790271023" observedRunningTime="2025-10-01 13:39:28.910401743 +0000 UTC m=+2777.255519229" watchObservedRunningTime="2025-10-01 13:39:28.917986419 +0000 UTC m=+2777.263103905" Oct 01 13:39:30 crc kubenswrapper[4851]: I1001 13:39:30.049859 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:39:30 crc kubenswrapper[4851]: I1001 13:39:30.049945 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:39:30 crc kubenswrapper[4851]: I1001 13:39:30.050003 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 13:39:30 crc kubenswrapper[4851]: I1001 13:39:30.050606 4851 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8c40691f1f8a44c2600070635532cee52f03666d706039454d1cc4f73511fa56"} pod="openshift-machine-config-operator/machine-config-daemon-fv72m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:39:30 crc kubenswrapper[4851]: I1001 13:39:30.050976 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" containerID="cri-o://8c40691f1f8a44c2600070635532cee52f03666d706039454d1cc4f73511fa56" gracePeriod=600 Oct 01 13:39:32 crc kubenswrapper[4851]: I1001 13:39:32.956123 4851 generic.go:334] "Generic (PLEG): container finished" podID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerID="8c40691f1f8a44c2600070635532cee52f03666d706039454d1cc4f73511fa56" exitCode=0 Oct 01 13:39:32 crc kubenswrapper[4851]: I1001 13:39:32.956179 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerDied","Data":"8c40691f1f8a44c2600070635532cee52f03666d706039454d1cc4f73511fa56"} Oct 01 13:39:32 crc kubenswrapper[4851]: I1001 13:39:32.956218 4851 scope.go:117] "RemoveContainer" containerID="4fd0a1c4e3d3c11f18d105ee08137bd210729c00a20287314c83a2ce7ab9a5bb" Oct 01 13:39:33 crc kubenswrapper[4851]: I1001 13:39:33.968463 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f"} Oct 01 13:39:34 crc kubenswrapper[4851]: I1001 13:39:34.141920 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6dnsw" Oct 01 13:39:34 crc kubenswrapper[4851]: I1001 13:39:34.141980 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6dnsw" Oct 01 13:39:34 crc kubenswrapper[4851]: I1001 13:39:34.188613 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6dnsw" Oct 01 13:39:35 crc kubenswrapper[4851]: I1001 13:39:35.023058 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6dnsw" Oct 01 13:39:35 crc kubenswrapper[4851]: I1001 13:39:35.071941 4851 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6dnsw"] Oct 01 13:39:37 crc kubenswrapper[4851]: I1001 13:39:37.002985 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6dnsw" podUID="833bc1ff-abf6-45c3-899d-46b15ee1a6f1" containerName="registry-server" containerID="cri-o://4600bf40d227745966d4241e9977952d767016375e854ffb5e5077bfbb208524" gracePeriod=2 Oct 01 13:39:37 crc kubenswrapper[4851]: I1001 13:39:37.504130 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6dnsw" Oct 01 13:39:37 crc kubenswrapper[4851]: I1001 13:39:37.604158 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/833bc1ff-abf6-45c3-899d-46b15ee1a6f1-catalog-content\") pod \"833bc1ff-abf6-45c3-899d-46b15ee1a6f1\" (UID: \"833bc1ff-abf6-45c3-899d-46b15ee1a6f1\") " Oct 01 13:39:37 crc kubenswrapper[4851]: I1001 13:39:37.604243 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/833bc1ff-abf6-45c3-899d-46b15ee1a6f1-utilities\") pod \"833bc1ff-abf6-45c3-899d-46b15ee1a6f1\" (UID: \"833bc1ff-abf6-45c3-899d-46b15ee1a6f1\") " Oct 01 13:39:37 crc kubenswrapper[4851]: I1001 13:39:37.604406 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfv2d\" (UniqueName: \"kubernetes.io/projected/833bc1ff-abf6-45c3-899d-46b15ee1a6f1-kube-api-access-hfv2d\") pod \"833bc1ff-abf6-45c3-899d-46b15ee1a6f1\" (UID: \"833bc1ff-abf6-45c3-899d-46b15ee1a6f1\") " Oct 01 13:39:37 crc kubenswrapper[4851]: I1001 13:39:37.605153 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/833bc1ff-abf6-45c3-899d-46b15ee1a6f1-utilities" (OuterVolumeSpecName: "utilities") pod "833bc1ff-abf6-45c3-899d-46b15ee1a6f1" (UID: "833bc1ff-abf6-45c3-899d-46b15ee1a6f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:39:37 crc kubenswrapper[4851]: I1001 13:39:37.619562 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/833bc1ff-abf6-45c3-899d-46b15ee1a6f1-kube-api-access-hfv2d" (OuterVolumeSpecName: "kube-api-access-hfv2d") pod "833bc1ff-abf6-45c3-899d-46b15ee1a6f1" (UID: "833bc1ff-abf6-45c3-899d-46b15ee1a6f1"). InnerVolumeSpecName "kube-api-access-hfv2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:39:37 crc kubenswrapper[4851]: I1001 13:39:37.698210 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/833bc1ff-abf6-45c3-899d-46b15ee1a6f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "833bc1ff-abf6-45c3-899d-46b15ee1a6f1" (UID: "833bc1ff-abf6-45c3-899d-46b15ee1a6f1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:39:37 crc kubenswrapper[4851]: I1001 13:39:37.706273 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/833bc1ff-abf6-45c3-899d-46b15ee1a6f1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:39:37 crc kubenswrapper[4851]: I1001 13:39:37.706304 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/833bc1ff-abf6-45c3-899d-46b15ee1a6f1-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:39:37 crc kubenswrapper[4851]: I1001 13:39:37.706313 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfv2d\" (UniqueName: \"kubernetes.io/projected/833bc1ff-abf6-45c3-899d-46b15ee1a6f1-kube-api-access-hfv2d\") on node \"crc\" DevicePath \"\"" Oct 01 13:39:38 crc kubenswrapper[4851]: I1001 13:39:38.016153 4851 generic.go:334] "Generic (PLEG): container finished" podID="833bc1ff-abf6-45c3-899d-46b15ee1a6f1" containerID="4600bf40d227745966d4241e9977952d767016375e854ffb5e5077bfbb208524" exitCode=0 Oct 01 13:39:38 crc kubenswrapper[4851]: I1001 13:39:38.016197 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dnsw" event={"ID":"833bc1ff-abf6-45c3-899d-46b15ee1a6f1","Type":"ContainerDied","Data":"4600bf40d227745966d4241e9977952d767016375e854ffb5e5077bfbb208524"} Oct 01 13:39:38 crc kubenswrapper[4851]: I1001 13:39:38.016225 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dnsw" event={"ID":"833bc1ff-abf6-45c3-899d-46b15ee1a6f1","Type":"ContainerDied","Data":"51ab7ac7798a735a00b6e61b595554bbdc02fc93c8538d912c5f352985502cc3"} Oct 01 13:39:38 crc kubenswrapper[4851]: I1001 13:39:38.016237 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6dnsw" Oct 01 13:39:38 crc kubenswrapper[4851]: I1001 13:39:38.016248 4851 scope.go:117] "RemoveContainer" containerID="4600bf40d227745966d4241e9977952d767016375e854ffb5e5077bfbb208524" Oct 01 13:39:38 crc kubenswrapper[4851]: I1001 13:39:38.053187 4851 scope.go:117] "RemoveContainer" containerID="6ecd83b1264b3d848fd188fbccb87848f3910be2fe0ef4e8f528f42801cf38ef" Oct 01 13:39:38 crc kubenswrapper[4851]: I1001 13:39:38.081548 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6dnsw"] Oct 01 13:39:38 crc kubenswrapper[4851]: I1001 13:39:38.092794 4851 scope.go:117] "RemoveContainer" containerID="24de540bd66c1e7d21c9e8d2fb9da9a532fd05df0c6ae1cce2341e6fd656ff73" Oct 01 13:39:38 crc kubenswrapper[4851]: I1001 13:39:38.139108 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6dnsw"] Oct 01 13:39:38 crc kubenswrapper[4851]: I1001 13:39:38.141796 4851 scope.go:117] "RemoveContainer" containerID="4600bf40d227745966d4241e9977952d767016375e854ffb5e5077bfbb208524" Oct 01 13:39:38 crc kubenswrapper[4851]: E1001 13:39:38.142241 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4600bf40d227745966d4241e9977952d767016375e854ffb5e5077bfbb208524\": container with ID starting with 4600bf40d227745966d4241e9977952d767016375e854ffb5e5077bfbb208524 not found: ID does not exist" containerID="4600bf40d227745966d4241e9977952d767016375e854ffb5e5077bfbb208524" Oct 01 13:39:38 crc kubenswrapper[4851]: I1001 13:39:38.142299 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4600bf40d227745966d4241e9977952d767016375e854ffb5e5077bfbb208524"} err="failed to get container status \"4600bf40d227745966d4241e9977952d767016375e854ffb5e5077bfbb208524\": rpc error: code = NotFound desc = could not find container \"4600bf40d227745966d4241e9977952d767016375e854ffb5e5077bfbb208524\": container with ID starting with 4600bf40d227745966d4241e9977952d767016375e854ffb5e5077bfbb208524 not found: ID does not exist" Oct 01 13:39:38 crc kubenswrapper[4851]: I1001 13:39:38.142332 4851 scope.go:117] "RemoveContainer" containerID="6ecd83b1264b3d848fd188fbccb87848f3910be2fe0ef4e8f528f42801cf38ef" Oct 01 13:39:38 crc kubenswrapper[4851]: E1001 13:39:38.142763 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ecd83b1264b3d848fd188fbccb87848f3910be2fe0ef4e8f528f42801cf38ef\": container with ID starting with 6ecd83b1264b3d848fd188fbccb87848f3910be2fe0ef4e8f528f42801cf38ef not found: ID does not exist" containerID="6ecd83b1264b3d848fd188fbccb87848f3910be2fe0ef4e8f528f42801cf38ef" Oct 01 13:39:38 crc kubenswrapper[4851]: I1001 13:39:38.142802 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ecd83b1264b3d848fd188fbccb87848f3910be2fe0ef4e8f528f42801cf38ef"} err="failed to get container status \"6ecd83b1264b3d848fd188fbccb87848f3910be2fe0ef4e8f528f42801cf38ef\": rpc error: code = NotFound desc = could not find container \"6ecd83b1264b3d848fd188fbccb87848f3910be2fe0ef4e8f528f42801cf38ef\": container with ID starting with 6ecd83b1264b3d848fd188fbccb87848f3910be2fe0ef4e8f528f42801cf38ef not found: ID does not exist" Oct 01 13:39:38 crc kubenswrapper[4851]: I1001 13:39:38.142826 4851 scope.go:117] "RemoveContainer" 
containerID="24de540bd66c1e7d21c9e8d2fb9da9a532fd05df0c6ae1cce2341e6fd656ff73" Oct 01 13:39:38 crc kubenswrapper[4851]: E1001 13:39:38.143123 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24de540bd66c1e7d21c9e8d2fb9da9a532fd05df0c6ae1cce2341e6fd656ff73\": container with ID starting with 24de540bd66c1e7d21c9e8d2fb9da9a532fd05df0c6ae1cce2341e6fd656ff73 not found: ID does not exist" containerID="24de540bd66c1e7d21c9e8d2fb9da9a532fd05df0c6ae1cce2341e6fd656ff73" Oct 01 13:39:38 crc kubenswrapper[4851]: I1001 13:39:38.143190 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24de540bd66c1e7d21c9e8d2fb9da9a532fd05df0c6ae1cce2341e6fd656ff73"} err="failed to get container status \"24de540bd66c1e7d21c9e8d2fb9da9a532fd05df0c6ae1cce2341e6fd656ff73\": rpc error: code = NotFound desc = could not find container \"24de540bd66c1e7d21c9e8d2fb9da9a532fd05df0c6ae1cce2341e6fd656ff73\": container with ID starting with 24de540bd66c1e7d21c9e8d2fb9da9a532fd05df0c6ae1cce2341e6fd656ff73 not found: ID does not exist" Oct 01 13:39:38 crc kubenswrapper[4851]: I1001 13:39:38.338963 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="833bc1ff-abf6-45c3-899d-46b15ee1a6f1" path="/var/lib/kubelet/pods/833bc1ff-abf6-45c3-899d-46b15ee1a6f1/volumes" Oct 01 13:41:18 crc kubenswrapper[4851]: I1001 13:41:18.424831 4851 scope.go:117] "RemoveContainer" containerID="f40ab020ee23f565cd919684c4aec00d488898d5a33696a64f20d969c80e54cb" Oct 01 13:41:18 crc kubenswrapper[4851]: I1001 13:41:18.454715 4851 scope.go:117] "RemoveContainer" containerID="a734c1ca19a257c3e38cd85d1abe7c46ef8a47ff828703749d24d3355c72154f" Oct 01 13:41:18 crc kubenswrapper[4851]: I1001 13:41:18.478237 4851 scope.go:117] "RemoveContainer" containerID="1ddd95c950b8bc084fc940ff2128b93b967a20e6574e48e5c191ea87892d231e" Oct 01 13:41:22 crc kubenswrapper[4851]: I1001 13:41:22.137957 4851 generic.go:334] "Generic (PLEG): container finished" podID="75a874c6-cd94-467a-ab74-bede44646604" containerID="d160465148f3bd699a84e719b81fdb95aa24a27c55a7a80047703b9220d33af8" exitCode=0 Oct 01 13:41:22 crc kubenswrapper[4851]: I1001 13:41:22.138145 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" event={"ID":"75a874c6-cd94-467a-ab74-bede44646604","Type":"ContainerDied","Data":"d160465148f3bd699a84e719b81fdb95aa24a27c55a7a80047703b9220d33af8"} Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.571071 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.678035 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-combined-ca-bundle\") pod \"75a874c6-cd94-467a-ab74-bede44646604\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.678236 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-ssh-key\") pod \"75a874c6-cd94-467a-ab74-bede44646604\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.678314 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-cell1-compute-config-1\") pod \"75a874c6-cd94-467a-ab74-bede44646604\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.678442 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-migration-ssh-key-1\") pod \"75a874c6-cd94-467a-ab74-bede44646604\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.678473 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-inventory\") pod \"75a874c6-cd94-467a-ab74-bede44646604\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.678563 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzbzb\" (UniqueName: \"kubernetes.io/projected/75a874c6-cd94-467a-ab74-bede44646604-kube-api-access-vzbzb\") pod \"75a874c6-cd94-467a-ab74-bede44646604\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.678609 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-migration-ssh-key-0\") pod \"75a874c6-cd94-467a-ab74-bede44646604\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.678643 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-cell1-compute-config-0\") pod \"75a874c6-cd94-467a-ab74-bede44646604\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.678693 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/75a874c6-cd94-467a-ab74-bede44646604-nova-extra-config-0\") pod \"75a874c6-cd94-467a-ab74-bede44646604\" (UID: \"75a874c6-cd94-467a-ab74-bede44646604\") " Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.684641 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/75a874c6-cd94-467a-ab74-bede44646604-kube-api-access-vzbzb" (OuterVolumeSpecName: "kube-api-access-vzbzb") pod "75a874c6-cd94-467a-ab74-bede44646604" (UID: "75a874c6-cd94-467a-ab74-bede44646604"). InnerVolumeSpecName "kube-api-access-vzbzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.684974 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "75a874c6-cd94-467a-ab74-bede44646604" (UID: "75a874c6-cd94-467a-ab74-bede44646604"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.707384 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "75a874c6-cd94-467a-ab74-bede44646604" (UID: "75a874c6-cd94-467a-ab74-bede44646604"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.708357 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "75a874c6-cd94-467a-ab74-bede44646604" (UID: "75a874c6-cd94-467a-ab74-bede44646604"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.709981 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-inventory" (OuterVolumeSpecName: "inventory") pod "75a874c6-cd94-467a-ab74-bede44646604" (UID: "75a874c6-cd94-467a-ab74-bede44646604"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.711392 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "75a874c6-cd94-467a-ab74-bede44646604" (UID: "75a874c6-cd94-467a-ab74-bede44646604"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.717349 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "75a874c6-cd94-467a-ab74-bede44646604" (UID: "75a874c6-cd94-467a-ab74-bede44646604"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.721340 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "75a874c6-cd94-467a-ab74-bede44646604" (UID: "75a874c6-cd94-467a-ab74-bede44646604"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.723832 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75a874c6-cd94-467a-ab74-bede44646604-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "75a874c6-cd94-467a-ab74-bede44646604" (UID: "75a874c6-cd94-467a-ab74-bede44646604"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.781101 4851 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/75a874c6-cd94-467a-ab74-bede44646604-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.781151 4851 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.781166 4851 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.781178 4851 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.781193 4851 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.781208 4851 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.781221 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzbzb\" (UniqueName: \"kubernetes.io/projected/75a874c6-cd94-467a-ab74-bede44646604-kube-api-access-vzbzb\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.781233 4851 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:23 crc kubenswrapper[4851]: I1001 13:41:23.781245 4851 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/75a874c6-cd94-467a-ab74-bede44646604-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.168177 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" event={"ID":"75a874c6-cd94-467a-ab74-bede44646604","Type":"ContainerDied","Data":"7a3de0685213bcc67c692d9181113650e239a37585f8b844171df08e915731f9"} Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.168231 4851 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="7a3de0685213bcc67c692d9181113650e239a37585f8b844171df08e915731f9" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.168315 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5sjkx" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.256967 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk"] Oct 01 13:41:24 crc kubenswrapper[4851]: E1001 13:41:24.257607 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="833bc1ff-abf6-45c3-899d-46b15ee1a6f1" containerName="extract-utilities" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.257633 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="833bc1ff-abf6-45c3-899d-46b15ee1a6f1" containerName="extract-utilities" Oct 01 13:41:24 crc kubenswrapper[4851]: E1001 13:41:24.257658 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="833bc1ff-abf6-45c3-899d-46b15ee1a6f1" containerName="extract-content" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.257670 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="833bc1ff-abf6-45c3-899d-46b15ee1a6f1" containerName="extract-content" Oct 01 13:41:24 crc kubenswrapper[4851]: E1001 13:41:24.257691 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="833bc1ff-abf6-45c3-899d-46b15ee1a6f1" containerName="registry-server" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.257702 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="833bc1ff-abf6-45c3-899d-46b15ee1a6f1" containerName="registry-server" Oct 01 13:41:24 crc kubenswrapper[4851]: E1001 13:41:24.257742 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75a874c6-cd94-467a-ab74-bede44646604" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.257754 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="75a874c6-cd94-467a-ab74-bede44646604" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.258109 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="833bc1ff-abf6-45c3-899d-46b15ee1a6f1" containerName="registry-server" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.258170 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="75a874c6-cd94-467a-ab74-bede44646604" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.259163 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.261247 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.262070 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2tz4d" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.263300 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.264382 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.264587 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.265239 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk"] Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.395445 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kznqk\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.395700 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kznqk\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.395776 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kznqk\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.395829 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kznqk\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.395928 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kznqk\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 
13:41:24.395977 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kznqk\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.396039 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx725\" (UniqueName: \"kubernetes.io/projected/bd7a7617-48e5-42a8-9630-ce17e87cde69-kube-api-access-hx725\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kznqk\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.497902 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kznqk\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.498031 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kznqk\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.498082 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kznqk\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.498212 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kznqk\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.499025 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kznqk\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.499115 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx725\" (UniqueName: \"kubernetes.io/projected/bd7a7617-48e5-42a8-9630-ce17e87cde69-kube-api-access-hx725\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kznqk\" (UID: 
\"bd7a7617-48e5-42a8-9630-ce17e87cde69\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.499265 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kznqk\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.503354 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kznqk\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.503599 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kznqk\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.503807 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kznqk\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.503859 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kznqk\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.505182 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kznqk\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.508956 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kznqk\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.526003 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx725\" (UniqueName: \"kubernetes.io/projected/bd7a7617-48e5-42a8-9630-ce17e87cde69-kube-api-access-hx725\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kznqk\" (UID: 
\"bd7a7617-48e5-42a8-9630-ce17e87cde69\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:24 crc kubenswrapper[4851]: I1001 13:41:24.582571 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:41:25 crc kubenswrapper[4851]: I1001 13:41:25.173377 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk"] Oct 01 13:41:25 crc kubenswrapper[4851]: I1001 13:41:25.184378 4851 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:41:26 crc kubenswrapper[4851]: I1001 13:41:26.195458 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" event={"ID":"bd7a7617-48e5-42a8-9630-ce17e87cde69","Type":"ContainerStarted","Data":"10901628554f186c1d024d4db8a77dfa93fbc1f7b6bdf0ef8d45d9ee8d0fcdb8"} Oct 01 13:41:26 crc kubenswrapper[4851]: I1001 13:41:26.195940 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" event={"ID":"bd7a7617-48e5-42a8-9630-ce17e87cde69","Type":"ContainerStarted","Data":"7e2143051108c63a4492c1a3df44b800c012ca74479ff40c7c0b4f8d1089a557"} Oct 01 13:41:26 crc kubenswrapper[4851]: I1001 13:41:26.252170 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" podStartSLOduration=1.7528760110000001 podStartE2EDuration="2.252152468s" podCreationTimestamp="2025-10-01 13:41:24 +0000 UTC" firstStartedPulling="2025-10-01 13:41:25.18402486 +0000 UTC m=+2893.529142346" lastFinishedPulling="2025-10-01 13:41:25.683301277 +0000 UTC m=+2894.028418803" observedRunningTime="2025-10-01 13:41:26.24310246 +0000 UTC m=+2894.588219946" watchObservedRunningTime="2025-10-01 13:41:26.252152468 +0000 UTC m=+2894.597269954" Oct 01 13:42:00 crc kubenswrapper[4851]: I1001 13:42:00.049743 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:42:00 crc kubenswrapper[4851]: I1001 13:42:00.050456 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:42:30 crc kubenswrapper[4851]: I1001 13:42:30.049817 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:42:30 crc kubenswrapper[4851]: I1001 13:42:30.050690 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:43:00 crc kubenswrapper[4851]: I1001 
Oct 01 13:43:00 crc kubenswrapper[4851]: I1001 13:43:00.051714 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 13:43:00 crc kubenswrapper[4851]: I1001 13:43:00.051788 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv72m"
Oct 01 13:43:00 crc kubenswrapper[4851]: I1001 13:43:00.052708 4851 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f"} pod="openshift-machine-config-operator/machine-config-daemon-fv72m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 01 13:43:00 crc kubenswrapper[4851]: I1001 13:43:00.052785 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" containerID="cri-o://4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f" gracePeriod=600
Oct 01 13:43:00 crc kubenswrapper[4851]: E1001 13:43:00.173071 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 13:43:00 crc kubenswrapper[4851]: I1001 13:43:00.238925 4851 generic.go:334] "Generic (PLEG): container finished" podID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f" exitCode=0
Oct 01 13:43:00 crc kubenswrapper[4851]: I1001 13:43:00.238985 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerDied","Data":"4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f"}
Oct 01 13:43:00 crc kubenswrapper[4851]: I1001 13:43:00.239041 4851 scope.go:117] "RemoveContainer" containerID="8c40691f1f8a44c2600070635532cee52f03666d706039454d1cc4f73511fa56"
Oct 01 13:43:00 crc kubenswrapper[4851]: I1001 13:43:00.239892 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f"
Oct 01 13:43:00 crc kubenswrapper[4851]: E1001 13:43:00.240294 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 13:43:14 crc kubenswrapper[4851]: I1001 13:43:14.329239 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f"
Oct 01 13:43:14 crc kubenswrapper[4851]: E1001 13:43:14.330150 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 13:43:25 crc kubenswrapper[4851]: I1001 13:43:25.329200 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f"
Oct 01 13:43:25 crc kubenswrapper[4851]: E1001 13:43:25.330409 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 13:43:40 crc kubenswrapper[4851]: I1001 13:43:40.328376 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f"
Oct 01 13:43:40 crc kubenswrapper[4851]: E1001 13:43:40.329271 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 13:43:53 crc kubenswrapper[4851]: I1001 13:43:53.329368 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f"
Oct 01 13:43:53 crc kubenswrapper[4851]: E1001 13:43:53.330199 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 13:43:58 crc kubenswrapper[4851]: I1001 13:43:58.913608 4851 generic.go:334] "Generic (PLEG): container finished" podID="bd7a7617-48e5-42a8-9630-ce17e87cde69" containerID="10901628554f186c1d024d4db8a77dfa93fbc1f7b6bdf0ef8d45d9ee8d0fcdb8" exitCode=0
Oct 01 13:43:58 crc kubenswrapper[4851]: I1001 13:43:58.913727 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" event={"ID":"bd7a7617-48e5-42a8-9630-ce17e87cde69","Type":"ContainerDied","Data":"10901628554f186c1d024d4db8a77dfa93fbc1f7b6bdf0ef8d45d9ee8d0fcdb8"}
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.407955 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk"
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.571929 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx725\" (UniqueName: \"kubernetes.io/projected/bd7a7617-48e5-42a8-9630-ce17e87cde69-kube-api-access-hx725\") pod \"bd7a7617-48e5-42a8-9630-ce17e87cde69\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") "
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.572018 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ceilometer-compute-config-data-1\") pod \"bd7a7617-48e5-42a8-9630-ce17e87cde69\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") "
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.572078 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-inventory\") pod \"bd7a7617-48e5-42a8-9630-ce17e87cde69\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") "
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.572180 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ceilometer-compute-config-data-0\") pod \"bd7a7617-48e5-42a8-9630-ce17e87cde69\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") "
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.572209 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ssh-key\") pod \"bd7a7617-48e5-42a8-9630-ce17e87cde69\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") "
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.572331 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-telemetry-combined-ca-bundle\") pod \"bd7a7617-48e5-42a8-9630-ce17e87cde69\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") "
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.572386 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ceilometer-compute-config-data-2\") pod \"bd7a7617-48e5-42a8-9630-ce17e87cde69\" (UID: \"bd7a7617-48e5-42a8-9630-ce17e87cde69\") "
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.578577 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd7a7617-48e5-42a8-9630-ce17e87cde69-kube-api-access-hx725" (OuterVolumeSpecName: "kube-api-access-hx725") pod "bd7a7617-48e5-42a8-9630-ce17e87cde69" (UID: "bd7a7617-48e5-42a8-9630-ce17e87cde69"). InnerVolumeSpecName "kube-api-access-hx725". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.585897 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "bd7a7617-48e5-42a8-9630-ce17e87cde69" (UID: "bd7a7617-48e5-42a8-9630-ce17e87cde69"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.608073 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "bd7a7617-48e5-42a8-9630-ce17e87cde69" (UID: "bd7a7617-48e5-42a8-9630-ce17e87cde69"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.609679 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "bd7a7617-48e5-42a8-9630-ce17e87cde69" (UID: "bd7a7617-48e5-42a8-9630-ce17e87cde69"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.611910 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "bd7a7617-48e5-42a8-9630-ce17e87cde69" (UID: "bd7a7617-48e5-42a8-9630-ce17e87cde69"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.622464 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bd7a7617-48e5-42a8-9630-ce17e87cde69" (UID: "bd7a7617-48e5-42a8-9630-ce17e87cde69"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.623208 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-inventory" (OuterVolumeSpecName: "inventory") pod "bd7a7617-48e5-42a8-9630-ce17e87cde69" (UID: "bd7a7617-48e5-42a8-9630-ce17e87cde69"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.675325 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx725\" (UniqueName: \"kubernetes.io/projected/bd7a7617-48e5-42a8-9630-ce17e87cde69-kube-api-access-hx725\") on node \"crc\" DevicePath \"\""
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.675364 4851 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.675378 4851 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-inventory\") on node \"crc\" DevicePath \"\""
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.675390 4851 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.675405 4851 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.675415 4851 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.675426 4851 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bd7a7617-48e5-42a8-9630-ce17e87cde69-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.939847 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" event={"ID":"bd7a7617-48e5-42a8-9630-ce17e87cde69","Type":"ContainerDied","Data":"7e2143051108c63a4492c1a3df44b800c012ca74479ff40c7c0b4f8d1089a557"}
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.939946 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e2143051108c63a4492c1a3df44b800c012ca74479ff40c7c0b4f8d1089a557"
Oct 01 13:44:00 crc kubenswrapper[4851]: I1001 13:44:00.940058 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk"
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kznqk" Oct 01 13:44:05 crc kubenswrapper[4851]: I1001 13:44:05.329428 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f" Oct 01 13:44:05 crc kubenswrapper[4851]: E1001 13:44:05.334340 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:44:16 crc kubenswrapper[4851]: I1001 13:44:16.328907 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f" Oct 01 13:44:16 crc kubenswrapper[4851]: E1001 13:44:16.329717 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:44:30 crc kubenswrapper[4851]: I1001 13:44:30.328440 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f" Oct 01 13:44:30 crc kubenswrapper[4851]: E1001 13:44:30.329442 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:44:37 crc kubenswrapper[4851]: I1001 13:44:37.438907 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:44:37 crc kubenswrapper[4851]: I1001 13:44:37.439594 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="cde88b6a-b10f-4282-9f02-ad48a766a911" containerName="thanos-sidecar" containerID="cri-o://be33d65d4e1fa8a9798a0fbebead9be31a2ceabe441b0fbc11a7cf73b6d3bae9" gracePeriod=600 Oct 01 13:44:37 crc kubenswrapper[4851]: I1001 13:44:37.439758 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="cde88b6a-b10f-4282-9f02-ad48a766a911" containerName="config-reloader" containerID="cri-o://f5f5da7e72201369b37ad67af4f52722232c57eca697c7135aebd305ad5b96f4" gracePeriod=600 Oct 01 13:44:37 crc kubenswrapper[4851]: I1001 13:44:37.439951 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="cde88b6a-b10f-4282-9f02-ad48a766a911" containerName="prometheus" containerID="cri-o://caf8b1977df89ba68e1cbf9d955d42f5ae1587d0eaf188335c919e81164ed6ea" gracePeriod=600 Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.366454 4851 generic.go:334] "Generic (PLEG): container finished" podID="cde88b6a-b10f-4282-9f02-ad48a766a911" 
containerID="be33d65d4e1fa8a9798a0fbebead9be31a2ceabe441b0fbc11a7cf73b6d3bae9" exitCode=0 Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.366829 4851 generic.go:334] "Generic (PLEG): container finished" podID="cde88b6a-b10f-4282-9f02-ad48a766a911" containerID="f5f5da7e72201369b37ad67af4f52722232c57eca697c7135aebd305ad5b96f4" exitCode=0 Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.366845 4851 generic.go:334] "Generic (PLEG): container finished" podID="cde88b6a-b10f-4282-9f02-ad48a766a911" containerID="caf8b1977df89ba68e1cbf9d955d42f5ae1587d0eaf188335c919e81164ed6ea" exitCode=0 Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.366623 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cde88b6a-b10f-4282-9f02-ad48a766a911","Type":"ContainerDied","Data":"be33d65d4e1fa8a9798a0fbebead9be31a2ceabe441b0fbc11a7cf73b6d3bae9"} Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.366890 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cde88b6a-b10f-4282-9f02-ad48a766a911","Type":"ContainerDied","Data":"f5f5da7e72201369b37ad67af4f52722232c57eca697c7135aebd305ad5b96f4"} Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.366908 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cde88b6a-b10f-4282-9f02-ad48a766a911","Type":"ContainerDied","Data":"caf8b1977df89ba68e1cbf9d955d42f5ae1587d0eaf188335c919e81164ed6ea"} Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.366922 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cde88b6a-b10f-4282-9f02-ad48a766a911","Type":"ContainerDied","Data":"dc323bd3fd6c37f9c2367dd83735e1183bcdb3249eb1d97445cfa3a79c3ddf16"} Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.366933 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc323bd3fd6c37f9c2367dd83735e1183bcdb3249eb1d97445cfa3a79c3ddf16" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.418084 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.526733 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cde88b6a-b10f-4282-9f02-ad48a766a911-config-out\") pod \"cde88b6a-b10f-4282-9f02-ad48a766a911\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.526825 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cde88b6a-b10f-4282-9f02-ad48a766a911-prometheus-metric-storage-rulefiles-0\") pod \"cde88b6a-b10f-4282-9f02-ad48a766a911\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.526853 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-web-config\") pod \"cde88b6a-b10f-4282-9f02-ad48a766a911\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.526870 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-config\") pod \"cde88b6a-b10f-4282-9f02-ad48a766a911\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.526921 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"cde88b6a-b10f-4282-9f02-ad48a766a911\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.526945 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr88m\" (UniqueName: \"kubernetes.io/projected/cde88b6a-b10f-4282-9f02-ad48a766a911-kube-api-access-wr88m\") pod \"cde88b6a-b10f-4282-9f02-ad48a766a911\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.526992 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cde88b6a-b10f-4282-9f02-ad48a766a911-tls-assets\") pod \"cde88b6a-b10f-4282-9f02-ad48a766a911\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.527021 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"cde88b6a-b10f-4282-9f02-ad48a766a911\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.527713 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\") pod \"cde88b6a-b10f-4282-9f02-ad48a766a911\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.527785 
4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-secret-combined-ca-bundle\") pod \"cde88b6a-b10f-4282-9f02-ad48a766a911\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.527822 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-thanos-prometheus-http-client-file\") pod \"cde88b6a-b10f-4282-9f02-ad48a766a911\" (UID: \"cde88b6a-b10f-4282-9f02-ad48a766a911\") " Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.529278 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cde88b6a-b10f-4282-9f02-ad48a766a911-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "cde88b6a-b10f-4282-9f02-ad48a766a911" (UID: "cde88b6a-b10f-4282-9f02-ad48a766a911"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.534121 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cde88b6a-b10f-4282-9f02-ad48a766a911-kube-api-access-wr88m" (OuterVolumeSpecName: "kube-api-access-wr88m") pod "cde88b6a-b10f-4282-9f02-ad48a766a911" (UID: "cde88b6a-b10f-4282-9f02-ad48a766a911"). InnerVolumeSpecName "kube-api-access-wr88m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.534634 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cde88b6a-b10f-4282-9f02-ad48a766a911-config-out" (OuterVolumeSpecName: "config-out") pod "cde88b6a-b10f-4282-9f02-ad48a766a911" (UID: "cde88b6a-b10f-4282-9f02-ad48a766a911"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.535597 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cde88b6a-b10f-4282-9f02-ad48a766a911-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "cde88b6a-b10f-4282-9f02-ad48a766a911" (UID: "cde88b6a-b10f-4282-9f02-ad48a766a911"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.546598 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "cde88b6a-b10f-4282-9f02-ad48a766a911" (UID: "cde88b6a-b10f-4282-9f02-ad48a766a911"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.546873 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "cde88b6a-b10f-4282-9f02-ad48a766a911" (UID: "cde88b6a-b10f-4282-9f02-ad48a766a911"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.552227 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "cde88b6a-b10f-4282-9f02-ad48a766a911" (UID: "cde88b6a-b10f-4282-9f02-ad48a766a911"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.552771 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "cde88b6a-b10f-4282-9f02-ad48a766a911" (UID: "cde88b6a-b10f-4282-9f02-ad48a766a911"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.554069 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-config" (OuterVolumeSpecName: "config") pod "cde88b6a-b10f-4282-9f02-ad48a766a911" (UID: "cde88b6a-b10f-4282-9f02-ad48a766a911"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.576311 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "cde88b6a-b10f-4282-9f02-ad48a766a911" (UID: "cde88b6a-b10f-4282-9f02-ad48a766a911"). InnerVolumeSpecName "pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.618704 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-web-config" (OuterVolumeSpecName: "web-config") pod "cde88b6a-b10f-4282-9f02-ad48a766a911" (UID: "cde88b6a-b10f-4282-9f02-ad48a766a911"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.634433 4851 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\") on node \"crc\" " Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.634472 4851 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.634488 4851 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.634591 4851 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cde88b6a-b10f-4282-9f02-ad48a766a911-config-out\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.634608 4851 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cde88b6a-b10f-4282-9f02-ad48a766a911-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.634624 4851 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-web-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.634636 4851 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.634651 4851 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.634666 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr88m\" (UniqueName: \"kubernetes.io/projected/cde88b6a-b10f-4282-9f02-ad48a766a911-kube-api-access-wr88m\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.634678 4851 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cde88b6a-b10f-4282-9f02-ad48a766a911-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.634689 4851 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cde88b6a-b10f-4282-9f02-ad48a766a911-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.700430 4851 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.702619 4851 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc") on node "crc" Oct 01 13:44:38 crc kubenswrapper[4851]: I1001 13:44:38.735963 4851 reconciler_common.go:293] "Volume detached for volume \"pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.374709 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.416523 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.426642 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.445311 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:44:39 crc kubenswrapper[4851]: E1001 13:44:39.445786 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde88b6a-b10f-4282-9f02-ad48a766a911" containerName="init-config-reloader" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.445807 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde88b6a-b10f-4282-9f02-ad48a766a911" containerName="init-config-reloader" Oct 01 13:44:39 crc kubenswrapper[4851]: E1001 13:44:39.445832 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde88b6a-b10f-4282-9f02-ad48a766a911" containerName="config-reloader" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.445843 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde88b6a-b10f-4282-9f02-ad48a766a911" containerName="config-reloader" Oct 01 13:44:39 crc kubenswrapper[4851]: E1001 13:44:39.445880 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde88b6a-b10f-4282-9f02-ad48a766a911" containerName="thanos-sidecar" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.445892 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde88b6a-b10f-4282-9f02-ad48a766a911" containerName="thanos-sidecar" Oct 01 13:44:39 crc kubenswrapper[4851]: E1001 13:44:39.445916 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde88b6a-b10f-4282-9f02-ad48a766a911" containerName="prometheus" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.445927 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde88b6a-b10f-4282-9f02-ad48a766a911" containerName="prometheus" Oct 01 13:44:39 crc kubenswrapper[4851]: E1001 13:44:39.445951 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd7a7617-48e5-42a8-9630-ce17e87cde69" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.445965 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd7a7617-48e5-42a8-9630-ce17e87cde69" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.446202 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="cde88b6a-b10f-4282-9f02-ad48a766a911" containerName="thanos-sidecar" Oct 01 13:44:39 crc 
kubenswrapper[4851]: I1001 13:44:39.446220 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="cde88b6a-b10f-4282-9f02-ad48a766a911" containerName="prometheus" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.446244 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd7a7617-48e5-42a8-9630-ce17e87cde69" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.446261 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="cde88b6a-b10f-4282-9f02-ad48a766a911" containerName="config-reloader" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.448575 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.452095 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.452335 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.452512 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.452929 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-vmjkn" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.453090 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.472342 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.474666 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.556358 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.556406 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.556463 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.556483 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.556527 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.556719 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-config\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.556767 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.556823 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p2rw\" (UniqueName: \"kubernetes.io/projected/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-kube-api-access-6p2rw\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.556941 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.557017 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.557141 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.658942 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.659001 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.659046 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.659091 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-config\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.659114 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.659143 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p2rw\" (UniqueName: \"kubernetes.io/projected/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-kube-api-access-6p2rw\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.659198 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.659241 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.659299 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" 
Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.659362 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.659391 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.662827 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.667575 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.668073 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.668239 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.668621 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.668789 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.670452 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: 
\"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.670585 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.671317 4851 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.671354 4851 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f6a0e30b7a50d8862a9481345085f62592a4f2276fdfe80014a12770adb24140/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.672305 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-config\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.684294 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p2rw\" (UniqueName: \"kubernetes.io/projected/e5c7e1ec-7093-4fa6-acf7-39a2839cfb11-kube-api-access-6p2rw\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.705443 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ceaf344b-7ce7-4dd1-8b64-4e82290b08bc\") pod \"prometheus-metric-storage-0\" (UID: \"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:39 crc kubenswrapper[4851]: I1001 13:44:39.793196 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 01 13:44:40 crc kubenswrapper[4851]: I1001 13:44:40.282935 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:44:40 crc kubenswrapper[4851]: I1001 13:44:40.346744 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cde88b6a-b10f-4282-9f02-ad48a766a911" path="/var/lib/kubelet/pods/cde88b6a-b10f-4282-9f02-ad48a766a911/volumes" Oct 01 13:44:40 crc kubenswrapper[4851]: I1001 13:44:40.388421 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11","Type":"ContainerStarted","Data":"284023c65282d4555a264e170346fb17d6236173f0e0922c6d29cf06da4fed82"} Oct 01 13:44:41 crc kubenswrapper[4851]: I1001 13:44:41.328576 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f" Oct 01 13:44:41 crc kubenswrapper[4851]: E1001 13:44:41.329084 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:44:41 crc kubenswrapper[4851]: I1001 13:44:41.391180 4851 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="cde88b6a-b10f-4282-9f02-ad48a766a911" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.136:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 01 13:44:44 crc kubenswrapper[4851]: I1001 13:44:44.422702 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11","Type":"ContainerStarted","Data":"46b17f182636bbe0ecb46fd05ad862fd4465967abfb3eec0849a1f1410370e04"} Oct 01 13:44:52 crc kubenswrapper[4851]: I1001 13:44:52.511868 4851 generic.go:334] "Generic (PLEG): container finished" podID="e5c7e1ec-7093-4fa6-acf7-39a2839cfb11" containerID="46b17f182636bbe0ecb46fd05ad862fd4465967abfb3eec0849a1f1410370e04" exitCode=0 Oct 01 13:44:52 crc kubenswrapper[4851]: I1001 13:44:52.511978 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11","Type":"ContainerDied","Data":"46b17f182636bbe0ecb46fd05ad862fd4465967abfb3eec0849a1f1410370e04"} Oct 01 13:44:53 crc kubenswrapper[4851]: I1001 13:44:53.524917 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11","Type":"ContainerStarted","Data":"c980962aa765b65eb2a8c5c95c50e1c996f1feeff5484e69b7c162a4693e3737"} Oct 01 13:44:55 crc kubenswrapper[4851]: I1001 13:44:55.342721 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f" Oct 01 13:44:55 crc kubenswrapper[4851]: E1001 13:44:55.344159 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:44:56 crc kubenswrapper[4851]: I1001 13:44:56.555571 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11","Type":"ContainerStarted","Data":"3bd0446823def9b0e25686130167c637a51338169f11e48cfc78db146c6a5adb"} Oct 01 13:44:57 crc kubenswrapper[4851]: I1001 13:44:57.565855 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e5c7e1ec-7093-4fa6-acf7-39a2839cfb11","Type":"ContainerStarted","Data":"9f03ddf06b65a467d5807f8e6241e6b40763fcaccda34b7ce23452961c1b97be"} Oct 01 13:44:57 crc kubenswrapper[4851]: I1001 13:44:57.616040 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.616018942 podStartE2EDuration="18.616018942s" podCreationTimestamp="2025-10-01 13:44:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:44:57.604944266 +0000 UTC m=+3105.950061762" watchObservedRunningTime="2025-10-01 13:44:57.616018942 +0000 UTC m=+3105.961136448" Oct 01 13:44:59 crc kubenswrapper[4851]: I1001 13:44:59.794443 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 01 13:45:00 crc kubenswrapper[4851]: I1001 13:45:00.195190 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322105-2gdk2"] Oct 01 13:45:00 crc kubenswrapper[4851]: I1001 13:45:00.225135 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322105-2gdk2"] Oct 01 13:45:00 crc kubenswrapper[4851]: I1001 13:45:00.225275 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-2gdk2" Oct 01 13:45:00 crc kubenswrapper[4851]: I1001 13:45:00.228525 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 13:45:00 crc kubenswrapper[4851]: I1001 13:45:00.231087 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 13:45:00 crc kubenswrapper[4851]: I1001 13:45:00.323300 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d95900f-86f8-43b8-b67c-cc68b42d1713-config-volume\") pod \"collect-profiles-29322105-2gdk2\" (UID: \"5d95900f-86f8-43b8-b67c-cc68b42d1713\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-2gdk2" Oct 01 13:45:00 crc kubenswrapper[4851]: I1001 13:45:00.323380 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6l8w\" (UniqueName: \"kubernetes.io/projected/5d95900f-86f8-43b8-b67c-cc68b42d1713-kube-api-access-x6l8w\") pod \"collect-profiles-29322105-2gdk2\" (UID: \"5d95900f-86f8-43b8-b67c-cc68b42d1713\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-2gdk2" Oct 01 13:45:00 crc kubenswrapper[4851]: I1001 13:45:00.323431 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d95900f-86f8-43b8-b67c-cc68b42d1713-secret-volume\") pod \"collect-profiles-29322105-2gdk2\" (UID: \"5d95900f-86f8-43b8-b67c-cc68b42d1713\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-2gdk2" Oct 01 13:45:00 crc kubenswrapper[4851]: I1001 13:45:00.425770 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d95900f-86f8-43b8-b67c-cc68b42d1713-config-volume\") pod \"collect-profiles-29322105-2gdk2\" (UID: \"5d95900f-86f8-43b8-b67c-cc68b42d1713\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-2gdk2" Oct 01 13:45:00 crc kubenswrapper[4851]: I1001 13:45:00.425901 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6l8w\" (UniqueName: \"kubernetes.io/projected/5d95900f-86f8-43b8-b67c-cc68b42d1713-kube-api-access-x6l8w\") pod \"collect-profiles-29322105-2gdk2\" (UID: \"5d95900f-86f8-43b8-b67c-cc68b42d1713\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-2gdk2" Oct 01 13:45:00 crc kubenswrapper[4851]: I1001 13:45:00.425958 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d95900f-86f8-43b8-b67c-cc68b42d1713-secret-volume\") pod \"collect-profiles-29322105-2gdk2\" (UID: \"5d95900f-86f8-43b8-b67c-cc68b42d1713\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-2gdk2" Oct 01 13:45:00 crc kubenswrapper[4851]: I1001 13:45:00.426742 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d95900f-86f8-43b8-b67c-cc68b42d1713-config-volume\") pod \"collect-profiles-29322105-2gdk2\" (UID: \"5d95900f-86f8-43b8-b67c-cc68b42d1713\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-2gdk2" Oct 01 13:45:00 crc 
kubenswrapper[4851]: I1001 13:45:00.456774 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d95900f-86f8-43b8-b67c-cc68b42d1713-secret-volume\") pod \"collect-profiles-29322105-2gdk2\" (UID: \"5d95900f-86f8-43b8-b67c-cc68b42d1713\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-2gdk2" Oct 01 13:45:00 crc kubenswrapper[4851]: I1001 13:45:00.460559 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6l8w\" (UniqueName: \"kubernetes.io/projected/5d95900f-86f8-43b8-b67c-cc68b42d1713-kube-api-access-x6l8w\") pod \"collect-profiles-29322105-2gdk2\" (UID: \"5d95900f-86f8-43b8-b67c-cc68b42d1713\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-2gdk2" Oct 01 13:45:00 crc kubenswrapper[4851]: I1001 13:45:00.552835 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-2gdk2" Oct 01 13:45:01 crc kubenswrapper[4851]: I1001 13:45:01.076662 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322105-2gdk2"] Oct 01 13:45:01 crc kubenswrapper[4851]: I1001 13:45:01.606023 4851 generic.go:334] "Generic (PLEG): container finished" podID="5d95900f-86f8-43b8-b67c-cc68b42d1713" containerID="1d193f9e6421e7a5a852991402ebf6f4d94cc4bd2135630cccd33ecb428b9169" exitCode=0 Oct 01 13:45:01 crc kubenswrapper[4851]: I1001 13:45:01.606089 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-2gdk2" event={"ID":"5d95900f-86f8-43b8-b67c-cc68b42d1713","Type":"ContainerDied","Data":"1d193f9e6421e7a5a852991402ebf6f4d94cc4bd2135630cccd33ecb428b9169"} Oct 01 13:45:01 crc kubenswrapper[4851]: I1001 13:45:01.606127 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-2gdk2" event={"ID":"5d95900f-86f8-43b8-b67c-cc68b42d1713","Type":"ContainerStarted","Data":"5f753853288f3988ddee221dc577ab61f3f9510651d5d2bf90056ec370d00a48"} Oct 01 13:45:03 crc kubenswrapper[4851]: I1001 13:45:03.043911 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-2gdk2" Oct 01 13:45:03 crc kubenswrapper[4851]: I1001 13:45:03.080913 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6l8w\" (UniqueName: \"kubernetes.io/projected/5d95900f-86f8-43b8-b67c-cc68b42d1713-kube-api-access-x6l8w\") pod \"5d95900f-86f8-43b8-b67c-cc68b42d1713\" (UID: \"5d95900f-86f8-43b8-b67c-cc68b42d1713\") " Oct 01 13:45:03 crc kubenswrapper[4851]: I1001 13:45:03.081111 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d95900f-86f8-43b8-b67c-cc68b42d1713-secret-volume\") pod \"5d95900f-86f8-43b8-b67c-cc68b42d1713\" (UID: \"5d95900f-86f8-43b8-b67c-cc68b42d1713\") " Oct 01 13:45:03 crc kubenswrapper[4851]: I1001 13:45:03.081232 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d95900f-86f8-43b8-b67c-cc68b42d1713-config-volume\") pod \"5d95900f-86f8-43b8-b67c-cc68b42d1713\" (UID: \"5d95900f-86f8-43b8-b67c-cc68b42d1713\") " Oct 01 13:45:03 crc kubenswrapper[4851]: I1001 13:45:03.083244 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d95900f-86f8-43b8-b67c-cc68b42d1713-config-volume" (OuterVolumeSpecName: "config-volume") pod "5d95900f-86f8-43b8-b67c-cc68b42d1713" (UID: "5d95900f-86f8-43b8-b67c-cc68b42d1713"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:45:03 crc kubenswrapper[4851]: I1001 13:45:03.087002 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d95900f-86f8-43b8-b67c-cc68b42d1713-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5d95900f-86f8-43b8-b67c-cc68b42d1713" (UID: "5d95900f-86f8-43b8-b67c-cc68b42d1713"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:45:03 crc kubenswrapper[4851]: I1001 13:45:03.088094 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d95900f-86f8-43b8-b67c-cc68b42d1713-kube-api-access-x6l8w" (OuterVolumeSpecName: "kube-api-access-x6l8w") pod "5d95900f-86f8-43b8-b67c-cc68b42d1713" (UID: "5d95900f-86f8-43b8-b67c-cc68b42d1713"). InnerVolumeSpecName "kube-api-access-x6l8w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:45:03 crc kubenswrapper[4851]: I1001 13:45:03.184084 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6l8w\" (UniqueName: \"kubernetes.io/projected/5d95900f-86f8-43b8-b67c-cc68b42d1713-kube-api-access-x6l8w\") on node \"crc\" DevicePath \"\"" Oct 01 13:45:03 crc kubenswrapper[4851]: I1001 13:45:03.184117 4851 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d95900f-86f8-43b8-b67c-cc68b42d1713-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:45:03 crc kubenswrapper[4851]: I1001 13:45:03.184128 4851 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d95900f-86f8-43b8-b67c-cc68b42d1713-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:45:03 crc kubenswrapper[4851]: I1001 13:45:03.633184 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-2gdk2" event={"ID":"5d95900f-86f8-43b8-b67c-cc68b42d1713","Type":"ContainerDied","Data":"5f753853288f3988ddee221dc577ab61f3f9510651d5d2bf90056ec370d00a48"} Oct 01 13:45:03 crc kubenswrapper[4851]: I1001 13:45:03.633234 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f753853288f3988ddee221dc577ab61f3f9510651d5d2bf90056ec370d00a48" Oct 01 13:45:03 crc kubenswrapper[4851]: I1001 13:45:03.633292 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-2gdk2" Oct 01 13:45:04 crc kubenswrapper[4851]: I1001 13:45:04.126171 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322060-nt85r"] Oct 01 13:45:04 crc kubenswrapper[4851]: I1001 13:45:04.135340 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322060-nt85r"] Oct 01 13:45:04 crc kubenswrapper[4851]: I1001 13:45:04.371597 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee89af1b-9cae-40d0-a6a4-555b63d3e419" path="/var/lib/kubelet/pods/ee89af1b-9cae-40d0-a6a4-555b63d3e419/volumes" Oct 01 13:45:09 crc kubenswrapper[4851]: I1001 13:45:09.330277 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f" Oct 01 13:45:09 crc kubenswrapper[4851]: E1001 13:45:09.330926 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:45:09 crc kubenswrapper[4851]: I1001 13:45:09.794391 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 01 13:45:09 crc kubenswrapper[4851]: I1001 13:45:09.803684 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 01 13:45:10 crc kubenswrapper[4851]: I1001 13:45:10.717134 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 01 13:45:17 crc kubenswrapper[4851]: I1001 
13:45:17.813594 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-64tq6"] Oct 01 13:45:17 crc kubenswrapper[4851]: E1001 13:45:17.814708 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d95900f-86f8-43b8-b67c-cc68b42d1713" containerName="collect-profiles" Oct 01 13:45:17 crc kubenswrapper[4851]: I1001 13:45:17.814727 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d95900f-86f8-43b8-b67c-cc68b42d1713" containerName="collect-profiles" Oct 01 13:45:17 crc kubenswrapper[4851]: I1001 13:45:17.815015 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d95900f-86f8-43b8-b67c-cc68b42d1713" containerName="collect-profiles" Oct 01 13:45:17 crc kubenswrapper[4851]: I1001 13:45:17.816804 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-64tq6" Oct 01 13:45:17 crc kubenswrapper[4851]: I1001 13:45:17.829627 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8973667-7248-4af8-9b41-862cf2672daf-utilities\") pod \"community-operators-64tq6\" (UID: \"e8973667-7248-4af8-9b41-862cf2672daf\") " pod="openshift-marketplace/community-operators-64tq6" Oct 01 13:45:17 crc kubenswrapper[4851]: I1001 13:45:17.829754 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vzk4\" (UniqueName: \"kubernetes.io/projected/e8973667-7248-4af8-9b41-862cf2672daf-kube-api-access-8vzk4\") pod \"community-operators-64tq6\" (UID: \"e8973667-7248-4af8-9b41-862cf2672daf\") " pod="openshift-marketplace/community-operators-64tq6" Oct 01 13:45:17 crc kubenswrapper[4851]: I1001 13:45:17.829851 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8973667-7248-4af8-9b41-862cf2672daf-catalog-content\") pod \"community-operators-64tq6\" (UID: \"e8973667-7248-4af8-9b41-862cf2672daf\") " pod="openshift-marketplace/community-operators-64tq6" Oct 01 13:45:17 crc kubenswrapper[4851]: I1001 13:45:17.841326 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-64tq6"] Oct 01 13:45:17 crc kubenswrapper[4851]: I1001 13:45:17.932239 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vzk4\" (UniqueName: \"kubernetes.io/projected/e8973667-7248-4af8-9b41-862cf2672daf-kube-api-access-8vzk4\") pod \"community-operators-64tq6\" (UID: \"e8973667-7248-4af8-9b41-862cf2672daf\") " pod="openshift-marketplace/community-operators-64tq6" Oct 01 13:45:17 crc kubenswrapper[4851]: I1001 13:45:17.932330 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8973667-7248-4af8-9b41-862cf2672daf-catalog-content\") pod \"community-operators-64tq6\" (UID: \"e8973667-7248-4af8-9b41-862cf2672daf\") " pod="openshift-marketplace/community-operators-64tq6" Oct 01 13:45:17 crc kubenswrapper[4851]: I1001 13:45:17.932530 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8973667-7248-4af8-9b41-862cf2672daf-utilities\") pod \"community-operators-64tq6\" (UID: \"e8973667-7248-4af8-9b41-862cf2672daf\") " pod="openshift-marketplace/community-operators-64tq6" Oct 01 13:45:17 crc 
kubenswrapper[4851]: I1001 13:45:17.932829 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8973667-7248-4af8-9b41-862cf2672daf-catalog-content\") pod \"community-operators-64tq6\" (UID: \"e8973667-7248-4af8-9b41-862cf2672daf\") " pod="openshift-marketplace/community-operators-64tq6"
Oct 01 13:45:17 crc kubenswrapper[4851]: I1001 13:45:17.933011 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8973667-7248-4af8-9b41-862cf2672daf-utilities\") pod \"community-operators-64tq6\" (UID: \"e8973667-7248-4af8-9b41-862cf2672daf\") " pod="openshift-marketplace/community-operators-64tq6"
Oct 01 13:45:17 crc kubenswrapper[4851]: I1001 13:45:17.950927 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vzk4\" (UniqueName: \"kubernetes.io/projected/e8973667-7248-4af8-9b41-862cf2672daf-kube-api-access-8vzk4\") pod \"community-operators-64tq6\" (UID: \"e8973667-7248-4af8-9b41-862cf2672daf\") " pod="openshift-marketplace/community-operators-64tq6"
Oct 01 13:45:18 crc kubenswrapper[4851]: I1001 13:45:18.138664 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-64tq6"
Oct 01 13:45:18 crc kubenswrapper[4851]: I1001 13:45:18.628705 4851 scope.go:117] "RemoveContainer" containerID="caf8b1977df89ba68e1cbf9d955d42f5ae1587d0eaf188335c919e81164ed6ea"
Oct 01 13:45:18 crc kubenswrapper[4851]: I1001 13:45:18.645246 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-64tq6"]
Oct 01 13:45:18 crc kubenswrapper[4851]: I1001 13:45:18.653101 4851 scope.go:117] "RemoveContainer" containerID="738d102c484953c3278b0eb697d0b8cb456b39fa78b02bc54fc0e106d151a536"
Oct 01 13:45:18 crc kubenswrapper[4851]: W1001 13:45:18.664614 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8973667_7248_4af8_9b41_862cf2672daf.slice/crio-5c72bd4f88e991fb3528abf831c39deeb3dc1069e32884a299f4d4b42bc2eaf3 WatchSource:0}: Error finding container 5c72bd4f88e991fb3528abf831c39deeb3dc1069e32884a299f4d4b42bc2eaf3: Status 404 returned error can't find the container with id 5c72bd4f88e991fb3528abf831c39deeb3dc1069e32884a299f4d4b42bc2eaf3
Oct 01 13:45:18 crc kubenswrapper[4851]: I1001 13:45:18.720959 4851 scope.go:117] "RemoveContainer" containerID="be33d65d4e1fa8a9798a0fbebead9be31a2ceabe441b0fbc11a7cf73b6d3bae9"
Oct 01 13:45:18 crc kubenswrapper[4851]: I1001 13:45:18.762730 4851 scope.go:117] "RemoveContainer" containerID="f5f5da7e72201369b37ad67af4f52722232c57eca697c7135aebd305ad5b96f4"
Oct 01 13:45:18 crc kubenswrapper[4851]: I1001 13:45:18.814173 4851 scope.go:117] "RemoveContainer" containerID="b2a7aa1574ac7509dc8dc51b4fd2502f4a8130f2c3f33baa41b5fb2f9152991d"
Oct 01 13:45:18 crc kubenswrapper[4851]: I1001 13:45:18.816230 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64tq6" event={"ID":"e8973667-7248-4af8-9b41-862cf2672daf","Type":"ContainerStarted","Data":"5c72bd4f88e991fb3528abf831c39deeb3dc1069e32884a299f4d4b42bc2eaf3"}
Oct 01 13:45:19 crc kubenswrapper[4851]: I1001 13:45:19.826398 4851 generic.go:334] "Generic (PLEG): container finished" podID="e8973667-7248-4af8-9b41-862cf2672daf" containerID="57362c0c1e63d853fc38cc7324b1f23c82488c35aa32157b0a41e97f495bdaf3" exitCode=0
Oct 01 13:45:19 crc kubenswrapper[4851]: I1001 13:45:19.826797 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64tq6" event={"ID":"e8973667-7248-4af8-9b41-862cf2672daf","Type":"ContainerDied","Data":"57362c0c1e63d853fc38cc7324b1f23c82488c35aa32157b0a41e97f495bdaf3"}
Oct 01 13:45:21 crc kubenswrapper[4851]: I1001 13:45:21.852816 4851 generic.go:334] "Generic (PLEG): container finished" podID="e8973667-7248-4af8-9b41-862cf2672daf" containerID="358c888f687ce7bb71c480c0c19008c323a4ba7a90891a0e9086a633fa7dc49c" exitCode=0
Oct 01 13:45:21 crc kubenswrapper[4851]: I1001 13:45:21.852958 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64tq6" event={"ID":"e8973667-7248-4af8-9b41-862cf2672daf","Type":"ContainerDied","Data":"358c888f687ce7bb71c480c0c19008c323a4ba7a90891a0e9086a633fa7dc49c"}
Oct 01 13:45:22 crc kubenswrapper[4851]: I1001 13:45:22.336147 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f"
Oct 01 13:45:22 crc kubenswrapper[4851]: E1001 13:45:22.336442 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 13:45:23 crc kubenswrapper[4851]: I1001 13:45:23.874967 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64tq6" event={"ID":"e8973667-7248-4af8-9b41-862cf2672daf","Type":"ContainerStarted","Data":"f9465ef93d3fee1276a85b664edbca5ef0f1da11668ecd67d8db5c5f37b7203a"}
Oct 01 13:45:23 crc kubenswrapper[4851]: I1001 13:45:23.898922 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-64tq6" podStartSLOduration=4.141310986 podStartE2EDuration="6.898889253s" podCreationTimestamp="2025-10-01 13:45:17 +0000 UTC" firstStartedPulling="2025-10-01 13:45:19.828736532 +0000 UTC m=+3128.173854018" lastFinishedPulling="2025-10-01 13:45:22.586314769 +0000 UTC m=+3130.931432285" observedRunningTime="2025-10-01 13:45:23.898698178 +0000 UTC m=+3132.243815694" watchObservedRunningTime="2025-10-01 13:45:23.898889253 +0000 UTC m=+3132.244006779"
Oct 01 13:45:28 crc kubenswrapper[4851]: I1001 13:45:28.139561 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-64tq6"
Oct 01 13:45:28 crc kubenswrapper[4851]: I1001 13:45:28.141089 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-64tq6"
Oct 01 13:45:28 crc kubenswrapper[4851]: I1001 13:45:28.185689 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-64tq6"
Oct 01 13:45:28 crc kubenswrapper[4851]: I1001 13:45:28.996942 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-64tq6"
Oct 01 13:45:29 crc kubenswrapper[4851]: I1001 13:45:29.110585 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-64tq6"]
Oct 01 13:45:30 crc kubenswrapper[4851]: I1001 13:45:30.946289 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-64tq6" podUID="e8973667-7248-4af8-9b41-862cf2672daf" containerName="registry-server" containerID="cri-o://f9465ef93d3fee1276a85b664edbca5ef0f1da11668ecd67d8db5c5f37b7203a" gracePeriod=2
Oct 01 13:45:31 crc kubenswrapper[4851]: I1001 13:45:31.523717 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-64tq6"
Oct 01 13:45:31 crc kubenswrapper[4851]: I1001 13:45:31.545847 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8973667-7248-4af8-9b41-862cf2672daf-utilities\") pod \"e8973667-7248-4af8-9b41-862cf2672daf\" (UID: \"e8973667-7248-4af8-9b41-862cf2672daf\") "
Oct 01 13:45:31 crc kubenswrapper[4851]: I1001 13:45:31.546374 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vzk4\" (UniqueName: \"kubernetes.io/projected/e8973667-7248-4af8-9b41-862cf2672daf-kube-api-access-8vzk4\") pod \"e8973667-7248-4af8-9b41-862cf2672daf\" (UID: \"e8973667-7248-4af8-9b41-862cf2672daf\") "
Oct 01 13:45:31 crc kubenswrapper[4851]: I1001 13:45:31.546728 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8973667-7248-4af8-9b41-862cf2672daf-catalog-content\") pod \"e8973667-7248-4af8-9b41-862cf2672daf\" (UID: \"e8973667-7248-4af8-9b41-862cf2672daf\") "
Oct 01 13:45:31 crc kubenswrapper[4851]: I1001 13:45:31.546767 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8973667-7248-4af8-9b41-862cf2672daf-utilities" (OuterVolumeSpecName: "utilities") pod "e8973667-7248-4af8-9b41-862cf2672daf" (UID: "e8973667-7248-4af8-9b41-862cf2672daf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:45:31 crc kubenswrapper[4851]: I1001 13:45:31.550549 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8973667-7248-4af8-9b41-862cf2672daf-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 13:45:31 crc kubenswrapper[4851]: I1001 13:45:31.577905 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8973667-7248-4af8-9b41-862cf2672daf-kube-api-access-8vzk4" (OuterVolumeSpecName: "kube-api-access-8vzk4") pod "e8973667-7248-4af8-9b41-862cf2672daf" (UID: "e8973667-7248-4af8-9b41-862cf2672daf"). InnerVolumeSpecName "kube-api-access-8vzk4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:45:31 crc kubenswrapper[4851]: I1001 13:45:31.654043 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vzk4\" (UniqueName: \"kubernetes.io/projected/e8973667-7248-4af8-9b41-862cf2672daf-kube-api-access-8vzk4\") on node \"crc\" DevicePath \"\""
Oct 01 13:45:31 crc kubenswrapper[4851]: I1001 13:45:31.790056 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8973667-7248-4af8-9b41-862cf2672daf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8973667-7248-4af8-9b41-862cf2672daf" (UID: "e8973667-7248-4af8-9b41-862cf2672daf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:45:31 crc kubenswrapper[4851]: I1001 13:45:31.857480 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8973667-7248-4af8-9b41-862cf2672daf-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 13:45:31 crc kubenswrapper[4851]: I1001 13:45:31.961341 4851 generic.go:334] "Generic (PLEG): container finished" podID="e8973667-7248-4af8-9b41-862cf2672daf" containerID="f9465ef93d3fee1276a85b664edbca5ef0f1da11668ecd67d8db5c5f37b7203a" exitCode=0
Oct 01 13:45:31 crc kubenswrapper[4851]: I1001 13:45:31.961362 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-64tq6"
Oct 01 13:45:31 crc kubenswrapper[4851]: I1001 13:45:31.961381 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64tq6" event={"ID":"e8973667-7248-4af8-9b41-862cf2672daf","Type":"ContainerDied","Data":"f9465ef93d3fee1276a85b664edbca5ef0f1da11668ecd67d8db5c5f37b7203a"}
Oct 01 13:45:31 crc kubenswrapper[4851]: I1001 13:45:31.962826 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64tq6" event={"ID":"e8973667-7248-4af8-9b41-862cf2672daf","Type":"ContainerDied","Data":"5c72bd4f88e991fb3528abf831c39deeb3dc1069e32884a299f4d4b42bc2eaf3"}
Oct 01 13:45:31 crc kubenswrapper[4851]: I1001 13:45:31.962877 4851 scope.go:117] "RemoveContainer" containerID="f9465ef93d3fee1276a85b664edbca5ef0f1da11668ecd67d8db5c5f37b7203a"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.014734 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-64tq6"]
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.015701 4851 scope.go:117] "RemoveContainer" containerID="358c888f687ce7bb71c480c0c19008c323a4ba7a90891a0e9086a633fa7dc49c"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.035769 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-64tq6"]
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.053806 4851 scope.go:117] "RemoveContainer" containerID="57362c0c1e63d853fc38cc7324b1f23c82488c35aa32157b0a41e97f495bdaf3"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.118632 4851 scope.go:117] "RemoveContainer" containerID="f9465ef93d3fee1276a85b664edbca5ef0f1da11668ecd67d8db5c5f37b7203a"
Oct 01 13:45:32 crc kubenswrapper[4851]: E1001 13:45:32.119080 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9465ef93d3fee1276a85b664edbca5ef0f1da11668ecd67d8db5c5f37b7203a\": container with ID starting with f9465ef93d3fee1276a85b664edbca5ef0f1da11668ecd67d8db5c5f37b7203a not found: ID does not exist" containerID="f9465ef93d3fee1276a85b664edbca5ef0f1da11668ecd67d8db5c5f37b7203a"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.119112 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9465ef93d3fee1276a85b664edbca5ef0f1da11668ecd67d8db5c5f37b7203a"} err="failed to get container status \"f9465ef93d3fee1276a85b664edbca5ef0f1da11668ecd67d8db5c5f37b7203a\": rpc error: code = NotFound desc = could not find container \"f9465ef93d3fee1276a85b664edbca5ef0f1da11668ecd67d8db5c5f37b7203a\": container with ID starting with f9465ef93d3fee1276a85b664edbca5ef0f1da11668ecd67d8db5c5f37b7203a not found: ID does not exist"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.119131 4851 scope.go:117] "RemoveContainer" containerID="358c888f687ce7bb71c480c0c19008c323a4ba7a90891a0e9086a633fa7dc49c"
Oct 01 13:45:32 crc kubenswrapper[4851]: E1001 13:45:32.119458 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"358c888f687ce7bb71c480c0c19008c323a4ba7a90891a0e9086a633fa7dc49c\": container with ID starting with 358c888f687ce7bb71c480c0c19008c323a4ba7a90891a0e9086a633fa7dc49c not found: ID does not exist" containerID="358c888f687ce7bb71c480c0c19008c323a4ba7a90891a0e9086a633fa7dc49c"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.119532 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"358c888f687ce7bb71c480c0c19008c323a4ba7a90891a0e9086a633fa7dc49c"} err="failed to get container status \"358c888f687ce7bb71c480c0c19008c323a4ba7a90891a0e9086a633fa7dc49c\": rpc error: code = NotFound desc = could not find container \"358c888f687ce7bb71c480c0c19008c323a4ba7a90891a0e9086a633fa7dc49c\": container with ID starting with 358c888f687ce7bb71c480c0c19008c323a4ba7a90891a0e9086a633fa7dc49c not found: ID does not exist"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.119578 4851 scope.go:117] "RemoveContainer" containerID="57362c0c1e63d853fc38cc7324b1f23c82488c35aa32157b0a41e97f495bdaf3"
Oct 01 13:45:32 crc kubenswrapper[4851]: E1001 13:45:32.119891 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57362c0c1e63d853fc38cc7324b1f23c82488c35aa32157b0a41e97f495bdaf3\": container with ID starting with 57362c0c1e63d853fc38cc7324b1f23c82488c35aa32157b0a41e97f495bdaf3 not found: ID does not exist" containerID="57362c0c1e63d853fc38cc7324b1f23c82488c35aa32157b0a41e97f495bdaf3"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.119918 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57362c0c1e63d853fc38cc7324b1f23c82488c35aa32157b0a41e97f495bdaf3"} err="failed to get container status \"57362c0c1e63d853fc38cc7324b1f23c82488c35aa32157b0a41e97f495bdaf3\": rpc error: code = NotFound desc = could not find container \"57362c0c1e63d853fc38cc7324b1f23c82488c35aa32157b0a41e97f495bdaf3\": container with ID starting with 57362c0c1e63d853fc38cc7324b1f23c82488c35aa32157b0a41e97f495bdaf3 not found: ID does not exist"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.351745 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8973667-7248-4af8-9b41-862cf2672daf" path="/var/lib/kubelet/pods/e8973667-7248-4af8-9b41-862cf2672daf/volumes"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.847710 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Oct 01 13:45:32 crc kubenswrapper[4851]: E1001 13:45:32.848965 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8973667-7248-4af8-9b41-862cf2672daf" containerName="extract-content"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.849008 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8973667-7248-4af8-9b41-862cf2672daf" containerName="extract-content"
Oct 01 13:45:32 crc kubenswrapper[4851]: E1001 13:45:32.849095 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8973667-7248-4af8-9b41-862cf2672daf" containerName="registry-server"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.849115 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8973667-7248-4af8-9b41-862cf2672daf" containerName="registry-server"
Oct 01 13:45:32 crc kubenswrapper[4851]: E1001 13:45:32.849170 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8973667-7248-4af8-9b41-862cf2672daf" containerName="extract-utilities"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.849190 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8973667-7248-4af8-9b41-862cf2672daf" containerName="extract-utilities"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.849761 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8973667-7248-4af8-9b41-862cf2672daf" containerName="registry-server"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.851309 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.855479 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.856120 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-wqpgn"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.856494 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.857094 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.871427 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.877275 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.877382 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/996ff379-292e-4a71-a09b-164fc21abe76-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.877429 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/996ff379-292e-4a71-a09b-164fc21abe76-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.877555 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/996ff379-292e-4a71-a09b-164fc21abe76-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.877637 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/996ff379-292e-4a71-a09b-164fc21abe76-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.877756 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/996ff379-292e-4a71-a09b-164fc21abe76-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.877855 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/996ff379-292e-4a71-a09b-164fc21abe76-config-data\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.877932 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7x6r\" (UniqueName: \"kubernetes.io/projected/996ff379-292e-4a71-a09b-164fc21abe76-kube-api-access-f7x6r\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.877966 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/996ff379-292e-4a71-a09b-164fc21abe76-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.982212 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/996ff379-292e-4a71-a09b-164fc21abe76-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.982263 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/996ff379-292e-4a71-a09b-164fc21abe76-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.982391 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/996ff379-292e-4a71-a09b-164fc21abe76-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.982489 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/996ff379-292e-4a71-a09b-164fc21abe76-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.982557 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/996ff379-292e-4a71-a09b-164fc21abe76-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.983047 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/996ff379-292e-4a71-a09b-164fc21abe76-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.983089 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/996ff379-292e-4a71-a09b-164fc21abe76-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.983135 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/996ff379-292e-4a71-a09b-164fc21abe76-config-data\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.983181 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7x6r\" (UniqueName: \"kubernetes.io/projected/996ff379-292e-4a71-a09b-164fc21abe76-kube-api-access-f7x6r\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.983199 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/996ff379-292e-4a71-a09b-164fc21abe76-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.983774 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.984134 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/996ff379-292e-4a71-a09b-164fc21abe76-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.984531 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/996ff379-292e-4a71-a09b-164fc21abe76-config-data\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.984808 4851 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.991346 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/996ff379-292e-4a71-a09b-164fc21abe76-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.991693 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/996ff379-292e-4a71-a09b-164fc21abe76-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:32 crc kubenswrapper[4851]: I1001 13:45:32.995313 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/996ff379-292e-4a71-a09b-164fc21abe76-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:33 crc kubenswrapper[4851]: I1001 13:45:33.004943 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7x6r\" (UniqueName: \"kubernetes.io/projected/996ff379-292e-4a71-a09b-164fc21abe76-kube-api-access-f7x6r\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:33 crc kubenswrapper[4851]: I1001 13:45:33.026432 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " pod="openstack/tempest-tests-tempest"
Oct 01 13:45:33 crc kubenswrapper[4851]: I1001 13:45:33.175624 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Oct 01 13:45:33 crc kubenswrapper[4851]: I1001 13:45:33.617468 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Oct 01 13:45:33 crc kubenswrapper[4851]: I1001 13:45:33.986617 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"996ff379-292e-4a71-a09b-164fc21abe76","Type":"ContainerStarted","Data":"f2a17b43e5559a79bbc43654e356c244a74fa6c4c363203ec6011af30af38f5f"}
Oct 01 13:45:34 crc kubenswrapper[4851]: I1001 13:45:34.329091 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f"
Oct 01 13:45:34 crc kubenswrapper[4851]: E1001 13:45:34.329439 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 13:45:47 crc kubenswrapper[4851]: I1001 13:45:47.099813 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"996ff379-292e-4a71-a09b-164fc21abe76","Type":"ContainerStarted","Data":"5b4251985f79c2018bf83d1cebb7382dd71c6ec474bbce4b04d12541a2e3d24b"}
Oct 01 13:45:48 crc kubenswrapper[4851]: I1001 13:45:48.329324 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f"
Oct 01 13:45:48 crc kubenswrapper[4851]: E1001 13:45:48.329937 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 13:46:02 crc kubenswrapper[4851]: I1001 13:46:02.329530 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f"
Oct 01 13:46:02 crc kubenswrapper[4851]: E1001 13:46:02.330738 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 13:46:15 crc kubenswrapper[4851]: I1001 13:46:15.328417 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f"
Oct 01 13:46:15 crc kubenswrapper[4851]: E1001 13:46:15.330034 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 13:46:27 crc kubenswrapper[4851]: I1001 13:46:27.329294 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f"
Oct 01 13:46:27 crc kubenswrapper[4851]: E1001 13:46:27.330777 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 13:46:41 crc kubenswrapper[4851]: I1001 13:46:41.329526 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f"
Oct 01 13:46:41 crc kubenswrapper[4851]: E1001 13:46:41.330873 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 13:46:55 crc kubenswrapper[4851]: I1001 13:46:55.329455 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f"
Oct 01 13:46:55 crc kubenswrapper[4851]: E1001 13:46:55.330324 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 13:47:08 crc kubenswrapper[4851]: I1001 13:47:08.329330 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f"
Oct 01 13:47:08 crc kubenswrapper[4851]: E1001 13:47:08.331086 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 13:47:19 crc kubenswrapper[4851]: I1001 13:47:19.329016 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f"
Oct 01 13:47:19 crc kubenswrapper[4851]: E1001 13:47:19.329974 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 13:47:32 crc kubenswrapper[4851]: I1001 13:47:32.351575 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f"
Oct 01 13:47:32 crc kubenswrapper[4851]: E1001 13:47:32.353069 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 13:47:43 crc kubenswrapper[4851]: I1001 13:47:43.328892 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f"
Oct 01 13:47:43 crc kubenswrapper[4851]: E1001 13:47:43.329869 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 13:47:55 crc kubenswrapper[4851]: I1001 13:47:55.328263 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f"
Oct 01 13:47:55 crc kubenswrapper[4851]: E1001 13:47:55.329223 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 13:48:10 crc kubenswrapper[4851]: I1001 13:48:10.328569 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f"
Oct 01 13:48:10 crc kubenswrapper[4851]: I1001 13:48:10.908085 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"bccfcb5cc69485d92b9238f696337dd72fc872b5406e4835ae7fbd5bfadb5238"}
Oct 01 13:48:10 crc kubenswrapper[4851]: I1001 13:48:10.935730 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=148.0405449 podStartE2EDuration="2m39.935695903s" podCreationTimestamp="2025-10-01 13:45:31 +0000 UTC" firstStartedPulling="2025-10-01 13:45:33.621653609 +0000 UTC m=+3141.966771105" lastFinishedPulling="2025-10-01 13:45:45.516804612 +0000 UTC m=+3153.861922108" observedRunningTime="2025-10-01 13:45:47.123398316 +0000 UTC m=+3155.468515822" watchObservedRunningTime="2025-10-01 13:48:10.935695903 +0000 UTC m=+3299.280813429"
Oct 01 13:48:31 crc kubenswrapper[4851]: I1001 13:48:31.993439 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n9hf8"]
Oct 01 13:48:32 crc kubenswrapper[4851]: I1001 13:48:31.998785 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n9hf8"
Oct 01 13:48:32 crc kubenswrapper[4851]: I1001 13:48:32.053091 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n9hf8"]
Oct 01 13:48:32 crc kubenswrapper[4851]: I1001 13:48:32.101119 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440f37e6-2b14-40af-8f84-b0576d502166-catalog-content\") pod \"redhat-marketplace-n9hf8\" (UID: \"440f37e6-2b14-40af-8f84-b0576d502166\") " pod="openshift-marketplace/redhat-marketplace-n9hf8"
Oct 01 13:48:32 crc kubenswrapper[4851]: I1001 13:48:32.101204 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnf89\" (UniqueName: \"kubernetes.io/projected/440f37e6-2b14-40af-8f84-b0576d502166-kube-api-access-dnf89\") pod \"redhat-marketplace-n9hf8\" (UID: \"440f37e6-2b14-40af-8f84-b0576d502166\") " pod="openshift-marketplace/redhat-marketplace-n9hf8"
Oct 01 13:48:32 crc kubenswrapper[4851]: I1001 13:48:32.101297 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440f37e6-2b14-40af-8f84-b0576d502166-utilities\") pod \"redhat-marketplace-n9hf8\" (UID: \"440f37e6-2b14-40af-8f84-b0576d502166\") " pod="openshift-marketplace/redhat-marketplace-n9hf8"
Oct 01 13:48:32 crc kubenswrapper[4851]: I1001 13:48:32.204242 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnf89\" (UniqueName: \"kubernetes.io/projected/440f37e6-2b14-40af-8f84-b0576d502166-kube-api-access-dnf89\") pod \"redhat-marketplace-n9hf8\" (UID: \"440f37e6-2b14-40af-8f84-b0576d502166\") " pod="openshift-marketplace/redhat-marketplace-n9hf8"
Oct 01 13:48:32 crc kubenswrapper[4851]: I1001 13:48:32.204414 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440f37e6-2b14-40af-8f84-b0576d502166-utilities\") pod \"redhat-marketplace-n9hf8\" (UID: \"440f37e6-2b14-40af-8f84-b0576d502166\") " pod="openshift-marketplace/redhat-marketplace-n9hf8"
Oct 01 13:48:32 crc kubenswrapper[4851]: I1001 13:48:32.204616 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440f37e6-2b14-40af-8f84-b0576d502166-catalog-content\") pod \"redhat-marketplace-n9hf8\" (UID: \"440f37e6-2b14-40af-8f84-b0576d502166\") " pod="openshift-marketplace/redhat-marketplace-n9hf8"
Oct 01 13:48:32 crc kubenswrapper[4851]: I1001 13:48:32.205278 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440f37e6-2b14-40af-8f84-b0576d502166-catalog-content\") pod \"redhat-marketplace-n9hf8\" (UID: \"440f37e6-2b14-40af-8f84-b0576d502166\") " pod="openshift-marketplace/redhat-marketplace-n9hf8"
Oct 01 13:48:32 crc kubenswrapper[4851]: I1001 13:48:32.205938 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440f37e6-2b14-40af-8f84-b0576d502166-utilities\") pod \"redhat-marketplace-n9hf8\" (UID: \"440f37e6-2b14-40af-8f84-b0576d502166\") " pod="openshift-marketplace/redhat-marketplace-n9hf8"
Oct 01 13:48:32 crc kubenswrapper[4851]: I1001 13:48:32.226005 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnf89\" (UniqueName: \"kubernetes.io/projected/440f37e6-2b14-40af-8f84-b0576d502166-kube-api-access-dnf89\") pod \"redhat-marketplace-n9hf8\" (UID: \"440f37e6-2b14-40af-8f84-b0576d502166\") " pod="openshift-marketplace/redhat-marketplace-n9hf8"
Oct 01 13:48:32 crc kubenswrapper[4851]: I1001 13:48:32.374043 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n9hf8"
Oct 01 13:48:33 crc kubenswrapper[4851]: I1001 13:48:32.856650 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n9hf8"]
Oct 01 13:48:33 crc kubenswrapper[4851]: W1001 13:48:32.865238 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod440f37e6_2b14_40af_8f84_b0576d502166.slice/crio-073ab66e671a7619ddb1cd4d785cbc45d6354c4f9887267cbbbc7ae11bf5b1f3 WatchSource:0}: Error finding container 073ab66e671a7619ddb1cd4d785cbc45d6354c4f9887267cbbbc7ae11bf5b1f3: Status 404 returned error can't find the container with id 073ab66e671a7619ddb1cd4d785cbc45d6354c4f9887267cbbbc7ae11bf5b1f3
Oct 01 13:48:33 crc kubenswrapper[4851]: I1001 13:48:33.177754 4851 generic.go:334] "Generic (PLEG): container finished" podID="440f37e6-2b14-40af-8f84-b0576d502166" containerID="e2fb607e16fe5b4dc5e62ad07aae56dcb1c89c67dcd94121f818def433bab36b" exitCode=0
Oct 01 13:48:33 crc kubenswrapper[4851]: I1001 13:48:33.177799 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n9hf8" event={"ID":"440f37e6-2b14-40af-8f84-b0576d502166","Type":"ContainerDied","Data":"e2fb607e16fe5b4dc5e62ad07aae56dcb1c89c67dcd94121f818def433bab36b"}
Oct 01 13:48:33 crc kubenswrapper[4851]: I1001 13:48:33.177828 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n9hf8" event={"ID":"440f37e6-2b14-40af-8f84-b0576d502166","Type":"ContainerStarted","Data":"073ab66e671a7619ddb1cd4d785cbc45d6354c4f9887267cbbbc7ae11bf5b1f3"}
Oct 01 13:48:33 crc kubenswrapper[4851]: I1001 13:48:33.180314 4851 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 01 13:48:35 crc kubenswrapper[4851]: I1001 13:48:35.214759 4851 generic.go:334] "Generic (PLEG): container finished" podID="440f37e6-2b14-40af-8f84-b0576d502166" containerID="21de7b0232eb4d58718472378284ff813cc80f67edaa9fb520683fbdeb3b4816" exitCode=0
Oct 01 13:48:35 crc kubenswrapper[4851]: I1001 13:48:35.214834 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n9hf8" event={"ID":"440f37e6-2b14-40af-8f84-b0576d502166","Type":"ContainerDied","Data":"21de7b0232eb4d58718472378284ff813cc80f67edaa9fb520683fbdeb3b4816"}
Oct 01 13:48:37 crc kubenswrapper[4851]: I1001 13:48:37.244441 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n9hf8" event={"ID":"440f37e6-2b14-40af-8f84-b0576d502166","Type":"ContainerStarted","Data":"3bca04ed17ceda9179456d5542f31da44e6fa4d10873891ebdba3de4129d9cb6"}
Oct 01 13:48:37 crc kubenswrapper[4851]: I1001 13:48:37.274464 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n9hf8" podStartSLOduration=2.9361295739999997 podStartE2EDuration="6.274437841s" podCreationTimestamp="2025-10-01 13:48:31 +0000 UTC" firstStartedPulling="2025-10-01 13:48:33.180041309 +0000 UTC m=+3321.525158795" lastFinishedPulling="2025-10-01 13:48:36.518349546 +0000 UTC m=+3324.863467062" observedRunningTime="2025-10-01 13:48:37.266464364 +0000 UTC m=+3325.611581890" watchObservedRunningTime="2025-10-01 13:48:37.274437841 +0000 UTC m=+3325.619555367"
Oct 01 13:48:42 crc kubenswrapper[4851]: I1001 13:48:42.374939 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n9hf8"
Oct 01 13:48:42 crc kubenswrapper[4851]: I1001 13:48:42.376818 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n9hf8"
Oct 01 13:48:42 crc kubenswrapper[4851]: I1001 13:48:42.465440 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n9hf8"
Oct 01 13:48:43 crc kubenswrapper[4851]: I1001 13:48:43.387783 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n9hf8"
Oct 01 13:48:43 crc kubenswrapper[4851]: I1001 13:48:43.461695 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n9hf8"]
Oct 01 13:48:45 crc kubenswrapper[4851]: I1001 13:48:45.348694 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n9hf8" podUID="440f37e6-2b14-40af-8f84-b0576d502166" containerName="registry-server" containerID="cri-o://3bca04ed17ceda9179456d5542f31da44e6fa4d10873891ebdba3de4129d9cb6" gracePeriod=2
Oct 01 13:48:45 crc kubenswrapper[4851]: I1001 13:48:45.914920 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n9hf8"
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.042586 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440f37e6-2b14-40af-8f84-b0576d502166-catalog-content\") pod \"440f37e6-2b14-40af-8f84-b0576d502166\" (UID: \"440f37e6-2b14-40af-8f84-b0576d502166\") "
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.042922 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnf89\" (UniqueName: \"kubernetes.io/projected/440f37e6-2b14-40af-8f84-b0576d502166-kube-api-access-dnf89\") pod \"440f37e6-2b14-40af-8f84-b0576d502166\" (UID: \"440f37e6-2b14-40af-8f84-b0576d502166\") "
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.043272 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440f37e6-2b14-40af-8f84-b0576d502166-utilities\") pod \"440f37e6-2b14-40af-8f84-b0576d502166\" (UID: \"440f37e6-2b14-40af-8f84-b0576d502166\") "
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.044249 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/440f37e6-2b14-40af-8f84-b0576d502166-utilities" (OuterVolumeSpecName: "utilities") pod "440f37e6-2b14-40af-8f84-b0576d502166" (UID: "440f37e6-2b14-40af-8f84-b0576d502166"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.044596 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440f37e6-2b14-40af-8f84-b0576d502166-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.051812 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/440f37e6-2b14-40af-8f84-b0576d502166-kube-api-access-dnf89" (OuterVolumeSpecName: "kube-api-access-dnf89") pod "440f37e6-2b14-40af-8f84-b0576d502166" (UID: "440f37e6-2b14-40af-8f84-b0576d502166"). InnerVolumeSpecName "kube-api-access-dnf89". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.054453 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/440f37e6-2b14-40af-8f84-b0576d502166-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "440f37e6-2b14-40af-8f84-b0576d502166" (UID: "440f37e6-2b14-40af-8f84-b0576d502166"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.146288 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440f37e6-2b14-40af-8f84-b0576d502166-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.146326 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnf89\" (UniqueName: \"kubernetes.io/projected/440f37e6-2b14-40af-8f84-b0576d502166-kube-api-access-dnf89\") on node \"crc\" DevicePath \"\""
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.363967 4851 generic.go:334] "Generic (PLEG): container finished" podID="440f37e6-2b14-40af-8f84-b0576d502166" containerID="3bca04ed17ceda9179456d5542f31da44e6fa4d10873891ebdba3de4129d9cb6" exitCode=0
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.364035 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n9hf8" event={"ID":"440f37e6-2b14-40af-8f84-b0576d502166","Type":"ContainerDied","Data":"3bca04ed17ceda9179456d5542f31da44e6fa4d10873891ebdba3de4129d9cb6"}
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.365078 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n9hf8" event={"ID":"440f37e6-2b14-40af-8f84-b0576d502166","Type":"ContainerDied","Data":"073ab66e671a7619ddb1cd4d785cbc45d6354c4f9887267cbbbc7ae11bf5b1f3"}
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.364087 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n9hf8"
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.365159 4851 scope.go:117] "RemoveContainer" containerID="3bca04ed17ceda9179456d5542f31da44e6fa4d10873891ebdba3de4129d9cb6"
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.396558 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n9hf8"]
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.404840 4851 scope.go:117] "RemoveContainer" containerID="21de7b0232eb4d58718472378284ff813cc80f67edaa9fb520683fbdeb3b4816"
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.413201 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n9hf8"]
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.436136 4851 scope.go:117] "RemoveContainer" containerID="e2fb607e16fe5b4dc5e62ad07aae56dcb1c89c67dcd94121f818def433bab36b"
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.510117 4851 scope.go:117] "RemoveContainer" containerID="3bca04ed17ceda9179456d5542f31da44e6fa4d10873891ebdba3de4129d9cb6"
Oct 01 13:48:46 crc kubenswrapper[4851]: E1001 13:48:46.510891 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bca04ed17ceda9179456d5542f31da44e6fa4d10873891ebdba3de4129d9cb6\": container with ID starting with 3bca04ed17ceda9179456d5542f31da44e6fa4d10873891ebdba3de4129d9cb6 not found: ID does not exist" containerID="3bca04ed17ceda9179456d5542f31da44e6fa4d10873891ebdba3de4129d9cb6"
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.510960 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bca04ed17ceda9179456d5542f31da44e6fa4d10873891ebdba3de4129d9cb6"} err="failed to get container status \"3bca04ed17ceda9179456d5542f31da44e6fa4d10873891ebdba3de4129d9cb6\": rpc error: code = NotFound desc = could not find container \"3bca04ed17ceda9179456d5542f31da44e6fa4d10873891ebdba3de4129d9cb6\": container with ID starting with 3bca04ed17ceda9179456d5542f31da44e6fa4d10873891ebdba3de4129d9cb6 not found: ID does not exist"
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.511004 4851 scope.go:117] "RemoveContainer" containerID="21de7b0232eb4d58718472378284ff813cc80f67edaa9fb520683fbdeb3b4816"
Oct 01 13:48:46 crc kubenswrapper[4851]: E1001 13:48:46.511538 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21de7b0232eb4d58718472378284ff813cc80f67edaa9fb520683fbdeb3b4816\": container with ID starting with 21de7b0232eb4d58718472378284ff813cc80f67edaa9fb520683fbdeb3b4816 not found: ID does not exist" containerID="21de7b0232eb4d58718472378284ff813cc80f67edaa9fb520683fbdeb3b4816"
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.511580 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21de7b0232eb4d58718472378284ff813cc80f67edaa9fb520683fbdeb3b4816"} err="failed to get container status \"21de7b0232eb4d58718472378284ff813cc80f67edaa9fb520683fbdeb3b4816\": rpc error: code = NotFound desc = could not find container \"21de7b0232eb4d58718472378284ff813cc80f67edaa9fb520683fbdeb3b4816\": container with ID starting with 21de7b0232eb4d58718472378284ff813cc80f67edaa9fb520683fbdeb3b4816 not found: ID does not exist"
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.511607 4851 scope.go:117] "RemoveContainer" containerID="e2fb607e16fe5b4dc5e62ad07aae56dcb1c89c67dcd94121f818def433bab36b"
Oct 01 13:48:46 crc kubenswrapper[4851]: E1001 13:48:46.512266 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2fb607e16fe5b4dc5e62ad07aae56dcb1c89c67dcd94121f818def433bab36b\": container with ID starting with e2fb607e16fe5b4dc5e62ad07aae56dcb1c89c67dcd94121f818def433bab36b not found: ID does not exist" containerID="e2fb607e16fe5b4dc5e62ad07aae56dcb1c89c67dcd94121f818def433bab36b"
Oct 01 13:48:46 crc kubenswrapper[4851]: I1001 13:48:46.512335 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2fb607e16fe5b4dc5e62ad07aae56dcb1c89c67dcd94121f818def433bab36b"} err="failed to get container status \"e2fb607e16fe5b4dc5e62ad07aae56dcb1c89c67dcd94121f818def433bab36b\": rpc error: code = NotFound desc = could not find container \"e2fb607e16fe5b4dc5e62ad07aae56dcb1c89c67dcd94121f818def433bab36b\": container with ID starting with e2fb607e16fe5b4dc5e62ad07aae56dcb1c89c67dcd94121f818def433bab36b not found: ID does not exist"
Oct 01 13:48:48 crc kubenswrapper[4851]: I1001 13:48:48.347668 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="440f37e6-2b14-40af-8f84-b0576d502166" path="/var/lib/kubelet/pods/440f37e6-2b14-40af-8f84-b0576d502166/volumes"
Oct 01 13:50:04 crc kubenswrapper[4851]: I1001 13:50:04.603142 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t857v"]
Oct 01 13:50:04 crc kubenswrapper[4851]: E1001 13:50:04.604200 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440f37e6-2b14-40af-8f84-b0576d502166" containerName="registry-server"
Oct 01 13:50:04 crc kubenswrapper[4851]: I1001 13:50:04.604219 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="440f37e6-2b14-40af-8f84-b0576d502166" containerName="registry-server"
Oct 01 13:50:04 crc kubenswrapper[4851]: E1001 13:50:04.604268 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440f37e6-2b14-40af-8f84-b0576d502166" containerName="extract-content"
Oct 01 13:50:04 crc kubenswrapper[4851]: I1001 13:50:04.604277 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="440f37e6-2b14-40af-8f84-b0576d502166" containerName="extract-content"
Oct 01 13:50:04 crc kubenswrapper[4851]: E1001 13:50:04.604296 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440f37e6-2b14-40af-8f84-b0576d502166" containerName="extract-utilities"
Oct 01 13:50:04 crc kubenswrapper[4851]: I1001 13:50:04.604305 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="440f37e6-2b14-40af-8f84-b0576d502166" containerName="extract-utilities"
Oct 01 13:50:04 crc kubenswrapper[4851]: I1001 13:50:04.604613 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="440f37e6-2b14-40af-8f84-b0576d502166" containerName="registry-server"
Oct 01 13:50:04 crc kubenswrapper[4851]: I1001 13:50:04.606401 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t857v"
Oct 01 13:50:04 crc kubenswrapper[4851]: I1001 13:50:04.618673 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t857v"]
Oct 01 13:50:04 crc kubenswrapper[4851]: I1001 13:50:04.729989 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/935b2a6b-01c9-42d0-9fde-360e12d0e909-utilities\") pod \"redhat-operators-t857v\" (UID: \"935b2a6b-01c9-42d0-9fde-360e12d0e909\") " pod="openshift-marketplace/redhat-operators-t857v"
Oct 01 13:50:04 crc kubenswrapper[4851]: I1001 13:50:04.730294 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/935b2a6b-01c9-42d0-9fde-360e12d0e909-catalog-content\") pod \"redhat-operators-t857v\" (UID: \"935b2a6b-01c9-42d0-9fde-360e12d0e909\") " pod="openshift-marketplace/redhat-operators-t857v"
Oct 01 13:50:04 crc kubenswrapper[4851]: I1001 13:50:04.730487 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-624dl\" (UniqueName: \"kubernetes.io/projected/935b2a6b-01c9-42d0-9fde-360e12d0e909-kube-api-access-624dl\") pod \"redhat-operators-t857v\" (UID: \"935b2a6b-01c9-42d0-9fde-360e12d0e909\") " pod="openshift-marketplace/redhat-operators-t857v"
Oct 01 13:50:04 crc kubenswrapper[4851]: I1001 13:50:04.833358 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/935b2a6b-01c9-42d0-9fde-360e12d0e909-utilities\") pod \"redhat-operators-t857v\" (UID: \"935b2a6b-01c9-42d0-9fde-360e12d0e909\") " pod="openshift-marketplace/redhat-operators-t857v"
Oct 01 13:50:04 crc kubenswrapper[4851]: I1001 13:50:04.833646 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/935b2a6b-01c9-42d0-9fde-360e12d0e909-catalog-content\") pod \"redhat-operators-t857v\" (UID: \"935b2a6b-01c9-42d0-9fde-360e12d0e909\") " pod="openshift-marketplace/redhat-operators-t857v"
Oct 01 13:50:04 crc kubenswrapper[4851]: I1001 13:50:04.833720 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-624dl\" (UniqueName: \"kubernetes.io/projected/935b2a6b-01c9-42d0-9fde-360e12d0e909-kube-api-access-624dl\") pod \"redhat-operators-t857v\" (UID: \"935b2a6b-01c9-42d0-9fde-360e12d0e909\") " pod="openshift-marketplace/redhat-operators-t857v"
Oct 01 13:50:04 crc kubenswrapper[4851]: I1001 13:50:04.834263 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/935b2a6b-01c9-42d0-9fde-360e12d0e909-utilities\") pod \"redhat-operators-t857v\" (UID: \"935b2a6b-01c9-42d0-9fde-360e12d0e909\") " pod="openshift-marketplace/redhat-operators-t857v"
Oct 01 13:50:04 crc kubenswrapper[4851]: I1001 13:50:04.834318 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/935b2a6b-01c9-42d0-9fde-360e12d0e909-catalog-content\") pod \"redhat-operators-t857v\" (UID: \"935b2a6b-01c9-42d0-9fde-360e12d0e909\") " pod="openshift-marketplace/redhat-operators-t857v"
Oct 01 13:50:04 crc kubenswrapper[4851]: I1001 13:50:04.865904 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-624dl\" (UniqueName: \"kubernetes.io/projected/935b2a6b-01c9-42d0-9fde-360e12d0e909-kube-api-access-624dl\") pod \"redhat-operators-t857v\" (UID: \"935b2a6b-01c9-42d0-9fde-360e12d0e909\") " pod="openshift-marketplace/redhat-operators-t857v"
Oct 01 13:50:04 crc kubenswrapper[4851]: I1001 13:50:04.941099 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t857v"
Oct 01 13:50:05 crc kubenswrapper[4851]: I1001 13:50:05.430812 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t857v"]
Oct 01 13:50:06 crc kubenswrapper[4851]: I1001 13:50:06.408700 4851 generic.go:334] "Generic (PLEG): container finished" podID="935b2a6b-01c9-42d0-9fde-360e12d0e909" containerID="d4a3fe8581e2faecaf6f6985c0c0a2d44fa84faae1897498591126198b80add1" exitCode=0
Oct 01 13:50:06 crc kubenswrapper[4851]: I1001 13:50:06.408818 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t857v" event={"ID":"935b2a6b-01c9-42d0-9fde-360e12d0e909","Type":"ContainerDied","Data":"d4a3fe8581e2faecaf6f6985c0c0a2d44fa84faae1897498591126198b80add1"}
Oct 01 13:50:06 crc kubenswrapper[4851]: I1001 13:50:06.409126 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t857v" event={"ID":"935b2a6b-01c9-42d0-9fde-360e12d0e909","Type":"ContainerStarted","Data":"c501c0f48e474943b28e17435fbc4813c1a0b550b95d091500b3a33076ed13e7"}
Oct 01 13:50:07 crc kubenswrapper[4851]: I1001 13:50:07.436609 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t857v" event={"ID":"935b2a6b-01c9-42d0-9fde-360e12d0e909","Type":"ContainerStarted","Data":"bdb2277522919a3d5f58c045a21e6a774527f6fb930642dc0abb05e976583e4f"}
Oct 01 13:50:11 crc kubenswrapper[4851]: I1001 13:50:11.509972 4851 generic.go:334] "Generic (PLEG): container finished" podID="935b2a6b-01c9-42d0-9fde-360e12d0e909" containerID="bdb2277522919a3d5f58c045a21e6a774527f6fb930642dc0abb05e976583e4f" exitCode=0
Oct 01 13:50:11 crc kubenswrapper[4851]: I1001 13:50:11.510741 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t857v" event={"ID":"935b2a6b-01c9-42d0-9fde-360e12d0e909","Type":"ContainerDied","Data":"bdb2277522919a3d5f58c045a21e6a774527f6fb930642dc0abb05e976583e4f"}
Oct 01 13:50:12 crc kubenswrapper[4851]: I1001 13:50:12.527366 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t857v" event={"ID":"935b2a6b-01c9-42d0-9fde-360e12d0e909","Type":"ContainerStarted","Data":"b8d0b7ad786ea17ae348d25982174f8c98560f1b3f3bd9031576fb3bb5e88332"}
Oct 01 13:50:12 crc kubenswrapper[4851]: I1001 13:50:12.564935 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t857v" podStartSLOduration=3.057516246 podStartE2EDuration="8.564900768s" podCreationTimestamp="2025-10-01 13:50:04 +0000 UTC" firstStartedPulling="2025-10-01 13:50:06.418318748 +0000 UTC m=+3414.763436234" lastFinishedPulling="2025-10-01 13:50:11.92570323 +0000 UTC m=+3420.270820756" observedRunningTime="2025-10-01 13:50:12.553676068 +0000 UTC m=+3420.898793594" watchObservedRunningTime="2025-10-01 13:50:12.564900768 +0000 UTC m=+3420.910018294"
Oct 01 13:50:14 crc kubenswrapper[4851]: I1001 13:50:14.941257 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t857v"
Oct 01 13:50:14 crc kubenswrapper[4851]: I1001 13:50:14.943132 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t857v"
Oct 01 13:50:15 crc kubenswrapper[4851]: I1001 13:50:15.997704 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t857v" podUID="935b2a6b-01c9-42d0-9fde-360e12d0e909" containerName="registry-server" probeResult="failure" output=<
Oct 01 13:50:15 crc kubenswrapper[4851]: timeout: failed to connect service ":50051" within 1s
Oct 01 13:50:15 crc kubenswrapper[4851]: >
Oct 01 13:50:26 crc kubenswrapper[4851]: I1001 13:50:26.028534 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t857v" podUID="935b2a6b-01c9-42d0-9fde-360e12d0e909" containerName="registry-server" probeResult="failure" output=<
Oct 01 13:50:26 crc kubenswrapper[4851]: timeout: failed to connect service ":50051" within 1s
Oct 01 13:50:26 crc kubenswrapper[4851]: >
Oct 01 13:50:30 crc kubenswrapper[4851]: I1001 13:50:30.050969 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 13:50:30 crc kubenswrapper[4851]: I1001 13:50:30.055671 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 13:50:34 crc kubenswrapper[4851]: I1001 13:50:34.999286 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t857v"
Oct 01 13:50:35 crc kubenswrapper[4851]: I1001 13:50:35.077000 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t857v"
Oct 01 13:50:35 crc kubenswrapper[4851]: I1001 13:50:35.809747 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t857v"]
Oct 01 13:50:36 crc kubenswrapper[4851]: I1001 13:50:36.809535 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t857v" podUID="935b2a6b-01c9-42d0-9fde-360e12d0e909" containerName="registry-server" containerID="cri-o://b8d0b7ad786ea17ae348d25982174f8c98560f1b3f3bd9031576fb3bb5e88332" gracePeriod=2
Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.441376 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t857v"
Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.451362 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-624dl\" (UniqueName: \"kubernetes.io/projected/935b2a6b-01c9-42d0-9fde-360e12d0e909-kube-api-access-624dl\") pod \"935b2a6b-01c9-42d0-9fde-360e12d0e909\" (UID: \"935b2a6b-01c9-42d0-9fde-360e12d0e909\") "
Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.451455 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/935b2a6b-01c9-42d0-9fde-360e12d0e909-catalog-content\") pod \"935b2a6b-01c9-42d0-9fde-360e12d0e909\" (UID: \"935b2a6b-01c9-42d0-9fde-360e12d0e909\") "
Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.451523 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/935b2a6b-01c9-42d0-9fde-360e12d0e909-utilities\") pod \"935b2a6b-01c9-42d0-9fde-360e12d0e909\" (UID: \"935b2a6b-01c9-42d0-9fde-360e12d0e909\") "
Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.453076 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/935b2a6b-01c9-42d0-9fde-360e12d0e909-utilities" (OuterVolumeSpecName: "utilities") pod "935b2a6b-01c9-42d0-9fde-360e12d0e909" (UID: "935b2a6b-01c9-42d0-9fde-360e12d0e909"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.466008 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/935b2a6b-01c9-42d0-9fde-360e12d0e909-kube-api-access-624dl" (OuterVolumeSpecName: "kube-api-access-624dl") pod "935b2a6b-01c9-42d0-9fde-360e12d0e909" (UID: "935b2a6b-01c9-42d0-9fde-360e12d0e909"). InnerVolumeSpecName "kube-api-access-624dl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.553555 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-624dl\" (UniqueName: \"kubernetes.io/projected/935b2a6b-01c9-42d0-9fde-360e12d0e909-kube-api-access-624dl\") on node \"crc\" DevicePath \"\""
Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.553585 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/935b2a6b-01c9-42d0-9fde-360e12d0e909-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.581461 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/935b2a6b-01c9-42d0-9fde-360e12d0e909-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "935b2a6b-01c9-42d0-9fde-360e12d0e909" (UID: "935b2a6b-01c9-42d0-9fde-360e12d0e909"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.654730 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/935b2a6b-01c9-42d0-9fde-360e12d0e909-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.824401 4851 generic.go:334] "Generic (PLEG): container finished" podID="935b2a6b-01c9-42d0-9fde-360e12d0e909" containerID="b8d0b7ad786ea17ae348d25982174f8c98560f1b3f3bd9031576fb3bb5e88332" exitCode=0 Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.824447 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t857v" event={"ID":"935b2a6b-01c9-42d0-9fde-360e12d0e909","Type":"ContainerDied","Data":"b8d0b7ad786ea17ae348d25982174f8c98560f1b3f3bd9031576fb3bb5e88332"} Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.824480 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t857v" event={"ID":"935b2a6b-01c9-42d0-9fde-360e12d0e909","Type":"ContainerDied","Data":"c501c0f48e474943b28e17435fbc4813c1a0b550b95d091500b3a33076ed13e7"} Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.824487 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t857v" Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.824525 4851 scope.go:117] "RemoveContainer" containerID="b8d0b7ad786ea17ae348d25982174f8c98560f1b3f3bd9031576fb3bb5e88332" Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.870708 4851 scope.go:117] "RemoveContainer" containerID="bdb2277522919a3d5f58c045a21e6a774527f6fb930642dc0abb05e976583e4f" Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.875530 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t857v"] Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.883973 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t857v"] Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.908472 4851 scope.go:117] "RemoveContainer" containerID="d4a3fe8581e2faecaf6f6985c0c0a2d44fa84faae1897498591126198b80add1" Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.981609 4851 scope.go:117] "RemoveContainer" containerID="b8d0b7ad786ea17ae348d25982174f8c98560f1b3f3bd9031576fb3bb5e88332" Oct 01 13:50:37 crc kubenswrapper[4851]: E1001 13:50:37.982320 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8d0b7ad786ea17ae348d25982174f8c98560f1b3f3bd9031576fb3bb5e88332\": container with ID starting with b8d0b7ad786ea17ae348d25982174f8c98560f1b3f3bd9031576fb3bb5e88332 not found: ID does not exist" containerID="b8d0b7ad786ea17ae348d25982174f8c98560f1b3f3bd9031576fb3bb5e88332" Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.982375 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d0b7ad786ea17ae348d25982174f8c98560f1b3f3bd9031576fb3bb5e88332"} err="failed to get container status \"b8d0b7ad786ea17ae348d25982174f8c98560f1b3f3bd9031576fb3bb5e88332\": rpc error: code = NotFound desc = could not find container \"b8d0b7ad786ea17ae348d25982174f8c98560f1b3f3bd9031576fb3bb5e88332\": container with ID starting with b8d0b7ad786ea17ae348d25982174f8c98560f1b3f3bd9031576fb3bb5e88332 not found: ID does not exist" Oct 01 13:50:37 crc 
kubenswrapper[4851]: I1001 13:50:37.982418 4851 scope.go:117] "RemoveContainer" containerID="bdb2277522919a3d5f58c045a21e6a774527f6fb930642dc0abb05e976583e4f" Oct 01 13:50:37 crc kubenswrapper[4851]: E1001 13:50:37.983320 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdb2277522919a3d5f58c045a21e6a774527f6fb930642dc0abb05e976583e4f\": container with ID starting with bdb2277522919a3d5f58c045a21e6a774527f6fb930642dc0abb05e976583e4f not found: ID does not exist" containerID="bdb2277522919a3d5f58c045a21e6a774527f6fb930642dc0abb05e976583e4f" Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.983409 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb2277522919a3d5f58c045a21e6a774527f6fb930642dc0abb05e976583e4f"} err="failed to get container status \"bdb2277522919a3d5f58c045a21e6a774527f6fb930642dc0abb05e976583e4f\": rpc error: code = NotFound desc = could not find container \"bdb2277522919a3d5f58c045a21e6a774527f6fb930642dc0abb05e976583e4f\": container with ID starting with bdb2277522919a3d5f58c045a21e6a774527f6fb930642dc0abb05e976583e4f not found: ID does not exist" Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.983433 4851 scope.go:117] "RemoveContainer" containerID="d4a3fe8581e2faecaf6f6985c0c0a2d44fa84faae1897498591126198b80add1" Oct 01 13:50:37 crc kubenswrapper[4851]: E1001 13:50:37.984075 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4a3fe8581e2faecaf6f6985c0c0a2d44fa84faae1897498591126198b80add1\": container with ID starting with d4a3fe8581e2faecaf6f6985c0c0a2d44fa84faae1897498591126198b80add1 not found: ID does not exist" containerID="d4a3fe8581e2faecaf6f6985c0c0a2d44fa84faae1897498591126198b80add1" Oct 01 13:50:37 crc kubenswrapper[4851]: I1001 13:50:37.984160 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4a3fe8581e2faecaf6f6985c0c0a2d44fa84faae1897498591126198b80add1"} err="failed to get container status \"d4a3fe8581e2faecaf6f6985c0c0a2d44fa84faae1897498591126198b80add1\": rpc error: code = NotFound desc = could not find container \"d4a3fe8581e2faecaf6f6985c0c0a2d44fa84faae1897498591126198b80add1\": container with ID starting with d4a3fe8581e2faecaf6f6985c0c0a2d44fa84faae1897498591126198b80add1 not found: ID does not exist" Oct 01 13:50:38 crc kubenswrapper[4851]: I1001 13:50:38.349587 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="935b2a6b-01c9-42d0-9fde-360e12d0e909" path="/var/lib/kubelet/pods/935b2a6b-01c9-42d0-9fde-360e12d0e909/volumes" Oct 01 13:51:00 crc kubenswrapper[4851]: I1001 13:51:00.050861 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:51:00 crc kubenswrapper[4851]: I1001 13:51:00.051556 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:51:30 crc kubenswrapper[4851]: I1001 13:51:30.050859 4851 patch_prober.go:28] interesting 
pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:51:30 crc kubenswrapper[4851]: I1001 13:51:30.051562 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:51:30 crc kubenswrapper[4851]: I1001 13:51:30.051632 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 13:51:30 crc kubenswrapper[4851]: I1001 13:51:30.052787 4851 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bccfcb5cc69485d92b9238f696337dd72fc872b5406e4835ae7fbd5bfadb5238"} pod="openshift-machine-config-operator/machine-config-daemon-fv72m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:51:30 crc kubenswrapper[4851]: I1001 13:51:30.052898 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" containerID="cri-o://bccfcb5cc69485d92b9238f696337dd72fc872b5406e4835ae7fbd5bfadb5238" gracePeriod=600 Oct 01 13:51:30 crc kubenswrapper[4851]: I1001 13:51:30.456731 4851 generic.go:334] "Generic (PLEG): container finished" podID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerID="bccfcb5cc69485d92b9238f696337dd72fc872b5406e4835ae7fbd5bfadb5238" exitCode=0 Oct 01 13:51:30 crc kubenswrapper[4851]: I1001 13:51:30.456796 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerDied","Data":"bccfcb5cc69485d92b9238f696337dd72fc872b5406e4835ae7fbd5bfadb5238"} Oct 01 13:51:30 crc kubenswrapper[4851]: I1001 13:51:30.457138 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe"} Oct 01 13:51:30 crc kubenswrapper[4851]: I1001 13:51:30.457167 4851 scope.go:117] "RemoveContainer" containerID="4af5e045d4edbc9e19dee409dbcd84231d14a35c15581dfc6c9fd20b4c62105f" Oct 01 13:53:02 crc kubenswrapper[4851]: I1001 13:53:02.184752 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wkbtb"] Oct 01 13:53:02 crc kubenswrapper[4851]: E1001 13:53:02.194327 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="935b2a6b-01c9-42d0-9fde-360e12d0e909" containerName="registry-server" Oct 01 13:53:02 crc kubenswrapper[4851]: I1001 13:53:02.194388 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="935b2a6b-01c9-42d0-9fde-360e12d0e909" containerName="registry-server" Oct 01 13:53:02 crc kubenswrapper[4851]: E1001 13:53:02.194456 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="935b2a6b-01c9-42d0-9fde-360e12d0e909" 
containerName="extract-utilities" Oct 01 13:53:02 crc kubenswrapper[4851]: I1001 13:53:02.194472 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="935b2a6b-01c9-42d0-9fde-360e12d0e909" containerName="extract-utilities" Oct 01 13:53:02 crc kubenswrapper[4851]: E1001 13:53:02.194541 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="935b2a6b-01c9-42d0-9fde-360e12d0e909" containerName="extract-content" Oct 01 13:53:02 crc kubenswrapper[4851]: I1001 13:53:02.194555 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="935b2a6b-01c9-42d0-9fde-360e12d0e909" containerName="extract-content" Oct 01 13:53:02 crc kubenswrapper[4851]: I1001 13:53:02.195669 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="935b2a6b-01c9-42d0-9fde-360e12d0e909" containerName="registry-server" Oct 01 13:53:02 crc kubenswrapper[4851]: I1001 13:53:02.201593 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wkbtb" Oct 01 13:53:02 crc kubenswrapper[4851]: I1001 13:53:02.239514 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wkbtb"] Oct 01 13:53:02 crc kubenswrapper[4851]: I1001 13:53:02.381475 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6758l\" (UniqueName: \"kubernetes.io/projected/2d25af93-dc08-496f-9bc4-92f9e769246b-kube-api-access-6758l\") pod \"certified-operators-wkbtb\" (UID: \"2d25af93-dc08-496f-9bc4-92f9e769246b\") " pod="openshift-marketplace/certified-operators-wkbtb" Oct 01 13:53:02 crc kubenswrapper[4851]: I1001 13:53:02.381548 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d25af93-dc08-496f-9bc4-92f9e769246b-catalog-content\") pod \"certified-operators-wkbtb\" (UID: \"2d25af93-dc08-496f-9bc4-92f9e769246b\") " pod="openshift-marketplace/certified-operators-wkbtb" Oct 01 13:53:02 crc kubenswrapper[4851]: I1001 13:53:02.382034 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d25af93-dc08-496f-9bc4-92f9e769246b-utilities\") pod \"certified-operators-wkbtb\" (UID: \"2d25af93-dc08-496f-9bc4-92f9e769246b\") " pod="openshift-marketplace/certified-operators-wkbtb" Oct 01 13:53:02 crc kubenswrapper[4851]: I1001 13:53:02.484768 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6758l\" (UniqueName: \"kubernetes.io/projected/2d25af93-dc08-496f-9bc4-92f9e769246b-kube-api-access-6758l\") pod \"certified-operators-wkbtb\" (UID: \"2d25af93-dc08-496f-9bc4-92f9e769246b\") " pod="openshift-marketplace/certified-operators-wkbtb" Oct 01 13:53:02 crc kubenswrapper[4851]: I1001 13:53:02.485159 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d25af93-dc08-496f-9bc4-92f9e769246b-catalog-content\") pod \"certified-operators-wkbtb\" (UID: \"2d25af93-dc08-496f-9bc4-92f9e769246b\") " pod="openshift-marketplace/certified-operators-wkbtb" Oct 01 13:53:02 crc kubenswrapper[4851]: I1001 13:53:02.485480 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d25af93-dc08-496f-9bc4-92f9e769246b-utilities\") pod \"certified-operators-wkbtb\" (UID: 
\"2d25af93-dc08-496f-9bc4-92f9e769246b\") " pod="openshift-marketplace/certified-operators-wkbtb" Oct 01 13:53:02 crc kubenswrapper[4851]: I1001 13:53:02.486090 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d25af93-dc08-496f-9bc4-92f9e769246b-utilities\") pod \"certified-operators-wkbtb\" (UID: \"2d25af93-dc08-496f-9bc4-92f9e769246b\") " pod="openshift-marketplace/certified-operators-wkbtb" Oct 01 13:53:02 crc kubenswrapper[4851]: I1001 13:53:02.486290 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d25af93-dc08-496f-9bc4-92f9e769246b-catalog-content\") pod \"certified-operators-wkbtb\" (UID: \"2d25af93-dc08-496f-9bc4-92f9e769246b\") " pod="openshift-marketplace/certified-operators-wkbtb" Oct 01 13:53:02 crc kubenswrapper[4851]: I1001 13:53:02.508615 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6758l\" (UniqueName: \"kubernetes.io/projected/2d25af93-dc08-496f-9bc4-92f9e769246b-kube-api-access-6758l\") pod \"certified-operators-wkbtb\" (UID: \"2d25af93-dc08-496f-9bc4-92f9e769246b\") " pod="openshift-marketplace/certified-operators-wkbtb" Oct 01 13:53:02 crc kubenswrapper[4851]: I1001 13:53:02.553253 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wkbtb" Oct 01 13:53:03 crc kubenswrapper[4851]: I1001 13:53:03.189707 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wkbtb"] Oct 01 13:53:03 crc kubenswrapper[4851]: I1001 13:53:03.556109 4851 generic.go:334] "Generic (PLEG): container finished" podID="2d25af93-dc08-496f-9bc4-92f9e769246b" containerID="629dfa92e363214c4c39d617fe4943d182cdd37cca2af9a5cd0bfedf929394a6" exitCode=0 Oct 01 13:53:03 crc kubenswrapper[4851]: I1001 13:53:03.556164 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkbtb" event={"ID":"2d25af93-dc08-496f-9bc4-92f9e769246b","Type":"ContainerDied","Data":"629dfa92e363214c4c39d617fe4943d182cdd37cca2af9a5cd0bfedf929394a6"} Oct 01 13:53:03 crc kubenswrapper[4851]: I1001 13:53:03.556529 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkbtb" event={"ID":"2d25af93-dc08-496f-9bc4-92f9e769246b","Type":"ContainerStarted","Data":"fde906e5d6a16f3b8800986aa34948a4023b930044a10b45304f1ce07cb903a5"} Oct 01 13:53:04 crc kubenswrapper[4851]: I1001 13:53:04.568866 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkbtb" event={"ID":"2d25af93-dc08-496f-9bc4-92f9e769246b","Type":"ContainerStarted","Data":"d50b2935f4239c514e753804939177b847d7e48e7e3f203c9c355d767ddacaf8"} Oct 01 13:53:05 crc kubenswrapper[4851]: I1001 13:53:05.584598 4851 generic.go:334] "Generic (PLEG): container finished" podID="2d25af93-dc08-496f-9bc4-92f9e769246b" containerID="d50b2935f4239c514e753804939177b847d7e48e7e3f203c9c355d767ddacaf8" exitCode=0 Oct 01 13:53:05 crc kubenswrapper[4851]: I1001 13:53:05.584661 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkbtb" event={"ID":"2d25af93-dc08-496f-9bc4-92f9e769246b","Type":"ContainerDied","Data":"d50b2935f4239c514e753804939177b847d7e48e7e3f203c9c355d767ddacaf8"} Oct 01 13:53:06 crc kubenswrapper[4851]: I1001 13:53:06.599159 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-wkbtb" event={"ID":"2d25af93-dc08-496f-9bc4-92f9e769246b","Type":"ContainerStarted","Data":"c8c961eb955f99497683bc879d1943c1af5707a3a6e93dd8db713d8b6cac862c"} Oct 01 13:53:06 crc kubenswrapper[4851]: I1001 13:53:06.627668 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wkbtb" podStartSLOduration=2.156312223 podStartE2EDuration="4.627647803s" podCreationTimestamp="2025-10-01 13:53:02 +0000 UTC" firstStartedPulling="2025-10-01 13:53:03.558715808 +0000 UTC m=+3591.903833294" lastFinishedPulling="2025-10-01 13:53:06.030051368 +0000 UTC m=+3594.375168874" observedRunningTime="2025-10-01 13:53:06.621546529 +0000 UTC m=+3594.966664015" watchObservedRunningTime="2025-10-01 13:53:06.627647803 +0000 UTC m=+3594.972765279" Oct 01 13:53:12 crc kubenswrapper[4851]: I1001 13:53:12.553724 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wkbtb" Oct 01 13:53:12 crc kubenswrapper[4851]: I1001 13:53:12.554246 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wkbtb" Oct 01 13:53:12 crc kubenswrapper[4851]: I1001 13:53:12.643646 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wkbtb" Oct 01 13:53:12 crc kubenswrapper[4851]: I1001 13:53:12.749814 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wkbtb" Oct 01 13:53:13 crc kubenswrapper[4851]: I1001 13:53:13.370522 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wkbtb"] Oct 01 13:53:14 crc kubenswrapper[4851]: I1001 13:53:14.689040 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wkbtb" podUID="2d25af93-dc08-496f-9bc4-92f9e769246b" containerName="registry-server" containerID="cri-o://c8c961eb955f99497683bc879d1943c1af5707a3a6e93dd8db713d8b6cac862c" gracePeriod=2 Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.250995 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wkbtb" Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.382686 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6758l\" (UniqueName: \"kubernetes.io/projected/2d25af93-dc08-496f-9bc4-92f9e769246b-kube-api-access-6758l\") pod \"2d25af93-dc08-496f-9bc4-92f9e769246b\" (UID: \"2d25af93-dc08-496f-9bc4-92f9e769246b\") " Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.382815 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d25af93-dc08-496f-9bc4-92f9e769246b-utilities\") pod \"2d25af93-dc08-496f-9bc4-92f9e769246b\" (UID: \"2d25af93-dc08-496f-9bc4-92f9e769246b\") " Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.383148 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d25af93-dc08-496f-9bc4-92f9e769246b-catalog-content\") pod \"2d25af93-dc08-496f-9bc4-92f9e769246b\" (UID: \"2d25af93-dc08-496f-9bc4-92f9e769246b\") " Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.386120 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d25af93-dc08-496f-9bc4-92f9e769246b-utilities" (OuterVolumeSpecName: "utilities") pod "2d25af93-dc08-496f-9bc4-92f9e769246b" (UID: "2d25af93-dc08-496f-9bc4-92f9e769246b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.389936 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d25af93-dc08-496f-9bc4-92f9e769246b-kube-api-access-6758l" (OuterVolumeSpecName: "kube-api-access-6758l") pod "2d25af93-dc08-496f-9bc4-92f9e769246b" (UID: "2d25af93-dc08-496f-9bc4-92f9e769246b"). InnerVolumeSpecName "kube-api-access-6758l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.428386 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d25af93-dc08-496f-9bc4-92f9e769246b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d25af93-dc08-496f-9bc4-92f9e769246b" (UID: "2d25af93-dc08-496f-9bc4-92f9e769246b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.485309 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d25af93-dc08-496f-9bc4-92f9e769246b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.485368 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6758l\" (UniqueName: \"kubernetes.io/projected/2d25af93-dc08-496f-9bc4-92f9e769246b-kube-api-access-6758l\") on node \"crc\" DevicePath \"\"" Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.485387 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d25af93-dc08-496f-9bc4-92f9e769246b-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.703990 4851 generic.go:334] "Generic (PLEG): container finished" podID="2d25af93-dc08-496f-9bc4-92f9e769246b" containerID="c8c961eb955f99497683bc879d1943c1af5707a3a6e93dd8db713d8b6cac862c" exitCode=0 Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.704046 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkbtb" event={"ID":"2d25af93-dc08-496f-9bc4-92f9e769246b","Type":"ContainerDied","Data":"c8c961eb955f99497683bc879d1943c1af5707a3a6e93dd8db713d8b6cac862c"} Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.704087 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkbtb" event={"ID":"2d25af93-dc08-496f-9bc4-92f9e769246b","Type":"ContainerDied","Data":"fde906e5d6a16f3b8800986aa34948a4023b930044a10b45304f1ce07cb903a5"} Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.704118 4851 scope.go:117] "RemoveContainer" containerID="c8c961eb955f99497683bc879d1943c1af5707a3a6e93dd8db713d8b6cac862c" Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.704134 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wkbtb" Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.736872 4851 scope.go:117] "RemoveContainer" containerID="d50b2935f4239c514e753804939177b847d7e48e7e3f203c9c355d767ddacaf8" Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.753450 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wkbtb"] Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.760980 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wkbtb"] Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.773409 4851 scope.go:117] "RemoveContainer" containerID="629dfa92e363214c4c39d617fe4943d182cdd37cca2af9a5cd0bfedf929394a6" Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.851605 4851 scope.go:117] "RemoveContainer" containerID="c8c961eb955f99497683bc879d1943c1af5707a3a6e93dd8db713d8b6cac862c" Oct 01 13:53:15 crc kubenswrapper[4851]: E1001 13:53:15.852691 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8c961eb955f99497683bc879d1943c1af5707a3a6e93dd8db713d8b6cac862c\": container with ID starting with c8c961eb955f99497683bc879d1943c1af5707a3a6e93dd8db713d8b6cac862c not found: ID does not exist" containerID="c8c961eb955f99497683bc879d1943c1af5707a3a6e93dd8db713d8b6cac862c" Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.852740 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8c961eb955f99497683bc879d1943c1af5707a3a6e93dd8db713d8b6cac862c"} err="failed to get container status \"c8c961eb955f99497683bc879d1943c1af5707a3a6e93dd8db713d8b6cac862c\": rpc error: code = NotFound desc = could not find container \"c8c961eb955f99497683bc879d1943c1af5707a3a6e93dd8db713d8b6cac862c\": container with ID starting with c8c961eb955f99497683bc879d1943c1af5707a3a6e93dd8db713d8b6cac862c not found: ID does not exist" Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.852771 4851 scope.go:117] "RemoveContainer" containerID="d50b2935f4239c514e753804939177b847d7e48e7e3f203c9c355d767ddacaf8" Oct 01 13:53:15 crc kubenswrapper[4851]: E1001 13:53:15.853207 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d50b2935f4239c514e753804939177b847d7e48e7e3f203c9c355d767ddacaf8\": container with ID starting with d50b2935f4239c514e753804939177b847d7e48e7e3f203c9c355d767ddacaf8 not found: ID does not exist" containerID="d50b2935f4239c514e753804939177b847d7e48e7e3f203c9c355d767ddacaf8" Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.853236 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d50b2935f4239c514e753804939177b847d7e48e7e3f203c9c355d767ddacaf8"} err="failed to get container status \"d50b2935f4239c514e753804939177b847d7e48e7e3f203c9c355d767ddacaf8\": rpc error: code = NotFound desc = could not find container \"d50b2935f4239c514e753804939177b847d7e48e7e3f203c9c355d767ddacaf8\": container with ID starting with d50b2935f4239c514e753804939177b847d7e48e7e3f203c9c355d767ddacaf8 not found: ID does not exist" Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.853255 4851 scope.go:117] "RemoveContainer" containerID="629dfa92e363214c4c39d617fe4943d182cdd37cca2af9a5cd0bfedf929394a6" Oct 01 13:53:15 crc kubenswrapper[4851]: E1001 13:53:15.853800 4851 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"629dfa92e363214c4c39d617fe4943d182cdd37cca2af9a5cd0bfedf929394a6\": container with ID starting with 629dfa92e363214c4c39d617fe4943d182cdd37cca2af9a5cd0bfedf929394a6 not found: ID does not exist" containerID="629dfa92e363214c4c39d617fe4943d182cdd37cca2af9a5cd0bfedf929394a6" Oct 01 13:53:15 crc kubenswrapper[4851]: I1001 13:53:15.853833 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"629dfa92e363214c4c39d617fe4943d182cdd37cca2af9a5cd0bfedf929394a6"} err="failed to get container status \"629dfa92e363214c4c39d617fe4943d182cdd37cca2af9a5cd0bfedf929394a6\": rpc error: code = NotFound desc = could not find container \"629dfa92e363214c4c39d617fe4943d182cdd37cca2af9a5cd0bfedf929394a6\": container with ID starting with 629dfa92e363214c4c39d617fe4943d182cdd37cca2af9a5cd0bfedf929394a6 not found: ID does not exist" Oct 01 13:53:16 crc kubenswrapper[4851]: I1001 13:53:16.343004 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d25af93-dc08-496f-9bc4-92f9e769246b" path="/var/lib/kubelet/pods/2d25af93-dc08-496f-9bc4-92f9e769246b/volumes" Oct 01 13:53:30 crc kubenswrapper[4851]: I1001 13:53:30.049955 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:53:30 crc kubenswrapper[4851]: I1001 13:53:30.050833 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:54:00 crc kubenswrapper[4851]: I1001 13:54:00.051777 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:54:00 crc kubenswrapper[4851]: I1001 13:54:00.052439 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:54:30 crc kubenswrapper[4851]: I1001 13:54:30.049816 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:54:30 crc kubenswrapper[4851]: I1001 13:54:30.051824 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:54:30 crc kubenswrapper[4851]: I1001 13:54:30.052006 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 13:54:30 crc kubenswrapper[4851]: I1001 13:54:30.053114 4851 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe"} pod="openshift-machine-config-operator/machine-config-daemon-fv72m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:54:30 crc kubenswrapper[4851]: I1001 13:54:30.053383 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" containerID="cri-o://e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" gracePeriod=600 Oct 01 13:54:30 crc kubenswrapper[4851]: E1001 13:54:30.180925 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:54:30 crc kubenswrapper[4851]: I1001 13:54:30.658750 4851 generic.go:334] "Generic (PLEG): container finished" podID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" exitCode=0 Oct 01 13:54:30 crc kubenswrapper[4851]: I1001 13:54:30.658857 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerDied","Data":"e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe"} Oct 01 13:54:30 crc kubenswrapper[4851]: I1001 13:54:30.659314 4851 scope.go:117] "RemoveContainer" containerID="bccfcb5cc69485d92b9238f696337dd72fc872b5406e4835ae7fbd5bfadb5238" Oct 01 13:54:30 crc kubenswrapper[4851]: I1001 13:54:30.662195 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:54:30 crc kubenswrapper[4851]: E1001 13:54:30.664188 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:54:44 crc kubenswrapper[4851]: I1001 13:54:44.328821 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:54:44 crc kubenswrapper[4851]: E1001 13:54:44.329984 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:54:57 crc 
kubenswrapper[4851]: I1001 13:54:57.330191 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:54:57 crc kubenswrapper[4851]: E1001 13:54:57.331478 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:55:10 crc kubenswrapper[4851]: I1001 13:55:10.329534 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:55:10 crc kubenswrapper[4851]: E1001 13:55:10.330538 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:55:22 crc kubenswrapper[4851]: I1001 13:55:22.337335 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:55:22 crc kubenswrapper[4851]: E1001 13:55:22.338431 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:55:35 crc kubenswrapper[4851]: I1001 13:55:35.329302 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:55:35 crc kubenswrapper[4851]: E1001 13:55:35.330533 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:55:48 crc kubenswrapper[4851]: I1001 13:55:48.328842 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:55:48 crc kubenswrapper[4851]: E1001 13:55:48.329763 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:56:01 crc kubenswrapper[4851]: I1001 13:56:01.328603 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:56:01 crc 
kubenswrapper[4851]: E1001 13:56:01.329844 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:56:16 crc kubenswrapper[4851]: I1001 13:56:16.332170 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:56:16 crc kubenswrapper[4851]: E1001 13:56:16.335884 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:56:28 crc kubenswrapper[4851]: I1001 13:56:28.601058 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wmkqj"] Oct 01 13:56:28 crc kubenswrapper[4851]: E1001 13:56:28.602110 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d25af93-dc08-496f-9bc4-92f9e769246b" containerName="registry-server" Oct 01 13:56:28 crc kubenswrapper[4851]: I1001 13:56:28.602126 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d25af93-dc08-496f-9bc4-92f9e769246b" containerName="registry-server" Oct 01 13:56:28 crc kubenswrapper[4851]: E1001 13:56:28.602160 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d25af93-dc08-496f-9bc4-92f9e769246b" containerName="extract-utilities" Oct 01 13:56:28 crc kubenswrapper[4851]: I1001 13:56:28.602168 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d25af93-dc08-496f-9bc4-92f9e769246b" containerName="extract-utilities" Oct 01 13:56:28 crc kubenswrapper[4851]: E1001 13:56:28.602184 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d25af93-dc08-496f-9bc4-92f9e769246b" containerName="extract-content" Oct 01 13:56:28 crc kubenswrapper[4851]: I1001 13:56:28.602191 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d25af93-dc08-496f-9bc4-92f9e769246b" containerName="extract-content" Oct 01 13:56:28 crc kubenswrapper[4851]: I1001 13:56:28.602390 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d25af93-dc08-496f-9bc4-92f9e769246b" containerName="registry-server" Oct 01 13:56:28 crc kubenswrapper[4851]: I1001 13:56:28.603869 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wmkqj" Oct 01 13:56:28 crc kubenswrapper[4851]: I1001 13:56:28.623461 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wmkqj"] Oct 01 13:56:28 crc kubenswrapper[4851]: I1001 13:56:28.655920 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c-catalog-content\") pod \"community-operators-wmkqj\" (UID: \"2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c\") " pod="openshift-marketplace/community-operators-wmkqj" Oct 01 13:56:28 crc kubenswrapper[4851]: I1001 13:56:28.655962 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c-utilities\") pod \"community-operators-wmkqj\" (UID: \"2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c\") " pod="openshift-marketplace/community-operators-wmkqj" Oct 01 13:56:28 crc kubenswrapper[4851]: I1001 13:56:28.655997 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dptsc\" (UniqueName: \"kubernetes.io/projected/2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c-kube-api-access-dptsc\") pod \"community-operators-wmkqj\" (UID: \"2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c\") " pod="openshift-marketplace/community-operators-wmkqj" Oct 01 13:56:28 crc kubenswrapper[4851]: I1001 13:56:28.758068 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c-utilities\") pod \"community-operators-wmkqj\" (UID: \"2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c\") " pod="openshift-marketplace/community-operators-wmkqj" Oct 01 13:56:28 crc kubenswrapper[4851]: I1001 13:56:28.758143 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dptsc\" (UniqueName: \"kubernetes.io/projected/2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c-kube-api-access-dptsc\") pod \"community-operators-wmkqj\" (UID: \"2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c\") " pod="openshift-marketplace/community-operators-wmkqj" Oct 01 13:56:28 crc kubenswrapper[4851]: I1001 13:56:28.758338 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c-catalog-content\") pod \"community-operators-wmkqj\" (UID: \"2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c\") " pod="openshift-marketplace/community-operators-wmkqj" Oct 01 13:56:28 crc kubenswrapper[4851]: I1001 13:56:28.758649 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c-utilities\") pod \"community-operators-wmkqj\" (UID: \"2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c\") " pod="openshift-marketplace/community-operators-wmkqj" Oct 01 13:56:28 crc kubenswrapper[4851]: I1001 13:56:28.758739 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c-catalog-content\") pod \"community-operators-wmkqj\" (UID: \"2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c\") " pod="openshift-marketplace/community-operators-wmkqj" Oct 01 13:56:28 crc kubenswrapper[4851]: I1001 13:56:28.789706 4851 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dptsc\" (UniqueName: \"kubernetes.io/projected/2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c-kube-api-access-dptsc\") pod \"community-operators-wmkqj\" (UID: \"2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c\") " pod="openshift-marketplace/community-operators-wmkqj" Oct 01 13:56:28 crc kubenswrapper[4851]: I1001 13:56:28.938663 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wmkqj" Oct 01 13:56:29 crc kubenswrapper[4851]: I1001 13:56:29.497124 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wmkqj"] Oct 01 13:56:29 crc kubenswrapper[4851]: W1001 13:56:29.505224 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2efdac77_86d0_4a34_a3dc_cbfd15b0ba9c.slice/crio-0574e060804be3f548d533cdfa6b2b975fe59ab7ff5ba9fe17dc416b492e2afc WatchSource:0}: Error finding container 0574e060804be3f548d533cdfa6b2b975fe59ab7ff5ba9fe17dc416b492e2afc: Status 404 returned error can't find the container with id 0574e060804be3f548d533cdfa6b2b975fe59ab7ff5ba9fe17dc416b492e2afc Oct 01 13:56:30 crc kubenswrapper[4851]: I1001 13:56:30.062038 4851 generic.go:334] "Generic (PLEG): container finished" podID="2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c" containerID="b1605d3e79c6e3983b235f35cc72a79dae4ccf5372e6d8716c20e08105984111" exitCode=0 Oct 01 13:56:30 crc kubenswrapper[4851]: I1001 13:56:30.062220 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmkqj" event={"ID":"2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c","Type":"ContainerDied","Data":"b1605d3e79c6e3983b235f35cc72a79dae4ccf5372e6d8716c20e08105984111"} Oct 01 13:56:30 crc kubenswrapper[4851]: I1001 13:56:30.062346 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmkqj" event={"ID":"2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c","Type":"ContainerStarted","Data":"0574e060804be3f548d533cdfa6b2b975fe59ab7ff5ba9fe17dc416b492e2afc"} Oct 01 13:56:30 crc kubenswrapper[4851]: I1001 13:56:30.066144 4851 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:56:31 crc kubenswrapper[4851]: I1001 13:56:31.328390 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:56:31 crc kubenswrapper[4851]: E1001 13:56:31.329118 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:56:32 crc kubenswrapper[4851]: I1001 13:56:32.086166 4851 generic.go:334] "Generic (PLEG): container finished" podID="2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c" containerID="e41550f06544d3526c0bdc97d8494e440aaab320067fde5e02e523c205e989aa" exitCode=0 Oct 01 13:56:32 crc kubenswrapper[4851]: I1001 13:56:32.086255 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmkqj" event={"ID":"2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c","Type":"ContainerDied","Data":"e41550f06544d3526c0bdc97d8494e440aaab320067fde5e02e523c205e989aa"} Oct 01 13:56:34 crc 
kubenswrapper[4851]: I1001 13:56:34.116325 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmkqj" event={"ID":"2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c","Type":"ContainerStarted","Data":"32f5a8861fc7d6b7bddb744d6898725da6499ff7f7b366562534bf486abd1d6b"} Oct 01 13:56:34 crc kubenswrapper[4851]: I1001 13:56:34.151091 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wmkqj" podStartSLOduration=3.235739686 podStartE2EDuration="6.151068558s" podCreationTimestamp="2025-10-01 13:56:28 +0000 UTC" firstStartedPulling="2025-10-01 13:56:30.065943621 +0000 UTC m=+3798.411061107" lastFinishedPulling="2025-10-01 13:56:32.981272463 +0000 UTC m=+3801.326389979" observedRunningTime="2025-10-01 13:56:34.145721546 +0000 UTC m=+3802.490839052" watchObservedRunningTime="2025-10-01 13:56:34.151068558 +0000 UTC m=+3802.496186054" Oct 01 13:56:38 crc kubenswrapper[4851]: I1001 13:56:38.939546 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wmkqj" Oct 01 13:56:38 crc kubenswrapper[4851]: I1001 13:56:38.940359 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wmkqj" Oct 01 13:56:39 crc kubenswrapper[4851]: I1001 13:56:39.018161 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wmkqj" Oct 01 13:56:39 crc kubenswrapper[4851]: I1001 13:56:39.252152 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wmkqj" Oct 01 13:56:39 crc kubenswrapper[4851]: I1001 13:56:39.369181 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wmkqj"] Oct 01 13:56:41 crc kubenswrapper[4851]: I1001 13:56:41.188610 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wmkqj" podUID="2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c" containerName="registry-server" containerID="cri-o://32f5a8861fc7d6b7bddb744d6898725da6499ff7f7b366562534bf486abd1d6b" gracePeriod=2 Oct 01 13:56:41 crc kubenswrapper[4851]: I1001 13:56:41.807048 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wmkqj" Oct 01 13:56:41 crc kubenswrapper[4851]: I1001 13:56:41.875153 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c-utilities\") pod \"2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c\" (UID: \"2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c\") " Oct 01 13:56:41 crc kubenswrapper[4851]: I1001 13:56:41.875253 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dptsc\" (UniqueName: \"kubernetes.io/projected/2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c-kube-api-access-dptsc\") pod \"2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c\" (UID: \"2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c\") " Oct 01 13:56:41 crc kubenswrapper[4851]: I1001 13:56:41.876434 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c-catalog-content\") pod \"2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c\" (UID: \"2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c\") " Oct 01 13:56:41 crc kubenswrapper[4851]: I1001 13:56:41.876908 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c-utilities" (OuterVolumeSpecName: "utilities") pod "2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c" (UID: "2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:56:41 crc kubenswrapper[4851]: I1001 13:56:41.882071 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c-kube-api-access-dptsc" (OuterVolumeSpecName: "kube-api-access-dptsc") pod "2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c" (UID: "2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c"). InnerVolumeSpecName "kube-api-access-dptsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:56:41 crc kubenswrapper[4851]: I1001 13:56:41.924195 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c" (UID: "2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:56:41 crc kubenswrapper[4851]: I1001 13:56:41.978451 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:41 crc kubenswrapper[4851]: I1001 13:56:41.978485 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dptsc\" (UniqueName: \"kubernetes.io/projected/2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c-kube-api-access-dptsc\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:41 crc kubenswrapper[4851]: I1001 13:56:41.978496 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:42 crc kubenswrapper[4851]: I1001 13:56:42.202492 4851 generic.go:334] "Generic (PLEG): container finished" podID="2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c" containerID="32f5a8861fc7d6b7bddb744d6898725da6499ff7f7b366562534bf486abd1d6b" exitCode=0 Oct 01 13:56:42 crc kubenswrapper[4851]: I1001 13:56:42.202573 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmkqj" event={"ID":"2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c","Type":"ContainerDied","Data":"32f5a8861fc7d6b7bddb744d6898725da6499ff7f7b366562534bf486abd1d6b"} Oct 01 13:56:42 crc kubenswrapper[4851]: I1001 13:56:42.202603 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wmkqj" Oct 01 13:56:42 crc kubenswrapper[4851]: I1001 13:56:42.202641 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmkqj" event={"ID":"2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c","Type":"ContainerDied","Data":"0574e060804be3f548d533cdfa6b2b975fe59ab7ff5ba9fe17dc416b492e2afc"} Oct 01 13:56:42 crc kubenswrapper[4851]: I1001 13:56:42.202677 4851 scope.go:117] "RemoveContainer" containerID="32f5a8861fc7d6b7bddb744d6898725da6499ff7f7b366562534bf486abd1d6b" Oct 01 13:56:42 crc kubenswrapper[4851]: I1001 13:56:42.227850 4851 scope.go:117] "RemoveContainer" containerID="e41550f06544d3526c0bdc97d8494e440aaab320067fde5e02e523c205e989aa" Oct 01 13:56:42 crc kubenswrapper[4851]: I1001 13:56:42.263202 4851 scope.go:117] "RemoveContainer" containerID="b1605d3e79c6e3983b235f35cc72a79dae4ccf5372e6d8716c20e08105984111" Oct 01 13:56:42 crc kubenswrapper[4851]: I1001 13:56:42.268271 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wmkqj"] Oct 01 13:56:42 crc kubenswrapper[4851]: I1001 13:56:42.282395 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wmkqj"] Oct 01 13:56:42 crc kubenswrapper[4851]: I1001 13:56:42.319497 4851 scope.go:117] "RemoveContainer" containerID="32f5a8861fc7d6b7bddb744d6898725da6499ff7f7b366562534bf486abd1d6b" Oct 01 13:56:42 crc kubenswrapper[4851]: E1001 13:56:42.320820 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32f5a8861fc7d6b7bddb744d6898725da6499ff7f7b366562534bf486abd1d6b\": container with ID starting with 32f5a8861fc7d6b7bddb744d6898725da6499ff7f7b366562534bf486abd1d6b not found: ID does not exist" containerID="32f5a8861fc7d6b7bddb744d6898725da6499ff7f7b366562534bf486abd1d6b" Oct 01 13:56:42 crc kubenswrapper[4851]: I1001 13:56:42.320883 
Oct 01 13:56:42 crc kubenswrapper[4851]: I1001 13:56:42.320924 4851 scope.go:117] "RemoveContainer" containerID="e41550f06544d3526c0bdc97d8494e440aaab320067fde5e02e523c205e989aa"
Oct 01 13:56:42 crc kubenswrapper[4851]: E1001 13:56:42.321323 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e41550f06544d3526c0bdc97d8494e440aaab320067fde5e02e523c205e989aa\": container with ID starting with e41550f06544d3526c0bdc97d8494e440aaab320067fde5e02e523c205e989aa not found: ID does not exist" containerID="e41550f06544d3526c0bdc97d8494e440aaab320067fde5e02e523c205e989aa"
Oct 01 13:56:42 crc kubenswrapper[4851]: I1001 13:56:42.321357 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e41550f06544d3526c0bdc97d8494e440aaab320067fde5e02e523c205e989aa"} err="failed to get container status \"e41550f06544d3526c0bdc97d8494e440aaab320067fde5e02e523c205e989aa\": rpc error: code = NotFound desc = could not find container \"e41550f06544d3526c0bdc97d8494e440aaab320067fde5e02e523c205e989aa\": container with ID starting with e41550f06544d3526c0bdc97d8494e440aaab320067fde5e02e523c205e989aa not found: ID does not exist"
Oct 01 13:56:42 crc kubenswrapper[4851]: I1001 13:56:42.321377 4851 scope.go:117] "RemoveContainer" containerID="b1605d3e79c6e3983b235f35cc72a79dae4ccf5372e6d8716c20e08105984111"
Oct 01 13:56:42 crc kubenswrapper[4851]: E1001 13:56:42.321717 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1605d3e79c6e3983b235f35cc72a79dae4ccf5372e6d8716c20e08105984111\": container with ID starting with b1605d3e79c6e3983b235f35cc72a79dae4ccf5372e6d8716c20e08105984111 not found: ID does not exist" containerID="b1605d3e79c6e3983b235f35cc72a79dae4ccf5372e6d8716c20e08105984111"
Oct 01 13:56:42 crc kubenswrapper[4851]: I1001 13:56:42.321759 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1605d3e79c6e3983b235f35cc72a79dae4ccf5372e6d8716c20e08105984111"} err="failed to get container status \"b1605d3e79c6e3983b235f35cc72a79dae4ccf5372e6d8716c20e08105984111\": rpc error: code = NotFound desc = could not find container \"b1605d3e79c6e3983b235f35cc72a79dae4ccf5372e6d8716c20e08105984111\": container with ID starting with b1605d3e79c6e3983b235f35cc72a79dae4ccf5372e6d8716c20e08105984111 not found: ID does not exist"
Oct 01 13:56:42 crc kubenswrapper[4851]: I1001 13:56:42.361128 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c" path="/var/lib/kubelet/pods/2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c/volumes"
Oct 01 13:56:46 crc kubenswrapper[4851]: I1001 13:56:46.328865 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe"
Oct 01 13:56:46 crc kubenswrapper[4851]: E1001 13:56:46.329859 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:56:59 crc kubenswrapper[4851]: I1001 13:56:59.329355 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:56:59 crc kubenswrapper[4851]: E1001 13:56:59.330286 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:57:14 crc kubenswrapper[4851]: I1001 13:57:14.328859 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:57:14 crc kubenswrapper[4851]: E1001 13:57:14.330052 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:57:26 crc kubenswrapper[4851]: I1001 13:57:26.328272 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:57:26 crc kubenswrapper[4851]: E1001 13:57:26.329260 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:57:37 crc kubenswrapper[4851]: I1001 13:57:37.328687 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:57:37 crc kubenswrapper[4851]: E1001 13:57:37.329517 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:57:48 crc kubenswrapper[4851]: I1001 13:57:48.329604 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:57:48 crc kubenswrapper[4851]: E1001 13:57:48.330390 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:58:03 crc kubenswrapper[4851]: I1001 13:58:03.328975 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:58:03 crc kubenswrapper[4851]: E1001 13:58:03.329741 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:58:18 crc kubenswrapper[4851]: I1001 13:58:18.328722 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:58:18 crc kubenswrapper[4851]: E1001 13:58:18.329907 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:58:29 crc kubenswrapper[4851]: I1001 13:58:29.328912 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:58:29 crc kubenswrapper[4851]: E1001 13:58:29.330438 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:58:43 crc kubenswrapper[4851]: I1001 13:58:43.328312 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:58:43 crc kubenswrapper[4851]: E1001 13:58:43.329383 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:58:56 crc kubenswrapper[4851]: I1001 13:58:56.328603 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:58:56 crc kubenswrapper[4851]: E1001 13:58:56.329873 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" 
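Everything from 13:56:46 to 13:59:23 is one loop: each pod sync wants to restart machine-config-daemon, and each attempt is refused because the container's crash-loop backoff has reached its ceiling, hence the identical "back-off 5m0s restarting failed container" message every ten-odd seconds until the window expires at 13:59:34. The policy being enforced is a capped exponential backoff; a sketch with a 10s initial delay doubling to a 5m cap (those values are assumed defaults here, not read from this log):

    package main

    import (
    	"fmt"
    	"time"
    )

    // Sketch of a capped exponential restart backoff: 10s, 20s, 40s, ...
    // capped at 5m. After enough consecutive crashes every retry waits the
    // full cap, which is the "back-off 5m0s" state seen above.
    func backoff(restarts int) time.Duration {
    	const (
    		initial  = 10 * time.Second
    		maxDelay = 5 * time.Minute
    	)
    	d := initial
    	for i := 0; i < restarts; i++ {
    		d *= 2
    		if d >= maxDelay {
    			return maxDelay
    		}
    	}
    	return d
    }

    func main() {
    	for _, n := range []int{0, 1, 2, 5, 10} {
    		fmt.Printf("after %2d restarts: wait %v\n", n, backoff(n))
    	}
    }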
podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:59:09 crc kubenswrapper[4851]: I1001 13:59:09.328481 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:59:09 crc kubenswrapper[4851]: E1001 13:59:09.330750 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:59:23 crc kubenswrapper[4851]: I1001 13:59:23.328369 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:59:23 crc kubenswrapper[4851]: E1001 13:59:23.329807 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 13:59:34 crc kubenswrapper[4851]: I1001 13:59:34.329154 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 13:59:35 crc kubenswrapper[4851]: I1001 13:59:35.290185 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"8fe8ded7a2296c0a9a6846baf026cdf6a0bd0acd4bb0b8ed66d9e9879aebcdb9"} Oct 01 13:59:57 crc kubenswrapper[4851]: I1001 13:59:57.253007 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9spvl"] Oct 01 13:59:57 crc kubenswrapper[4851]: E1001 13:59:57.254227 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c" containerName="extract-utilities" Oct 01 13:59:57 crc kubenswrapper[4851]: I1001 13:59:57.254249 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c" containerName="extract-utilities" Oct 01 13:59:57 crc kubenswrapper[4851]: E1001 13:59:57.254275 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c" containerName="extract-content" Oct 01 13:59:57 crc kubenswrapper[4851]: I1001 13:59:57.254287 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c" containerName="extract-content" Oct 01 13:59:57 crc kubenswrapper[4851]: E1001 13:59:57.254316 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c" containerName="registry-server" Oct 01 13:59:57 crc kubenswrapper[4851]: I1001 13:59:57.254328 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c" containerName="registry-server" Oct 01 13:59:57 crc kubenswrapper[4851]: I1001 13:59:57.254721 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="2efdac77-86d0-4a34-a3dc-cbfd15b0ba9c" containerName="registry-server" Oct 01 13:59:57 crc kubenswrapper[4851]: I1001 13:59:57.257180 4851 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9spvl" Oct 01 13:59:57 crc kubenswrapper[4851]: I1001 13:59:57.279205 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9spvl"] Oct 01 13:59:57 crc kubenswrapper[4851]: I1001 13:59:57.368563 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d50126c-0124-4929-967c-523cf31894a5-catalog-content\") pod \"redhat-marketplace-9spvl\" (UID: \"0d50126c-0124-4929-967c-523cf31894a5\") " pod="openshift-marketplace/redhat-marketplace-9spvl" Oct 01 13:59:57 crc kubenswrapper[4851]: I1001 13:59:57.368736 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d50126c-0124-4929-967c-523cf31894a5-utilities\") pod \"redhat-marketplace-9spvl\" (UID: \"0d50126c-0124-4929-967c-523cf31894a5\") " pod="openshift-marketplace/redhat-marketplace-9spvl" Oct 01 13:59:57 crc kubenswrapper[4851]: I1001 13:59:57.368828 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn9sg\" (UniqueName: \"kubernetes.io/projected/0d50126c-0124-4929-967c-523cf31894a5-kube-api-access-gn9sg\") pod \"redhat-marketplace-9spvl\" (UID: \"0d50126c-0124-4929-967c-523cf31894a5\") " pod="openshift-marketplace/redhat-marketplace-9spvl" Oct 01 13:59:57 crc kubenswrapper[4851]: I1001 13:59:57.470443 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn9sg\" (UniqueName: \"kubernetes.io/projected/0d50126c-0124-4929-967c-523cf31894a5-kube-api-access-gn9sg\") pod \"redhat-marketplace-9spvl\" (UID: \"0d50126c-0124-4929-967c-523cf31894a5\") " pod="openshift-marketplace/redhat-marketplace-9spvl" Oct 01 13:59:57 crc kubenswrapper[4851]: I1001 13:59:57.470637 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d50126c-0124-4929-967c-523cf31894a5-catalog-content\") pod \"redhat-marketplace-9spvl\" (UID: \"0d50126c-0124-4929-967c-523cf31894a5\") " pod="openshift-marketplace/redhat-marketplace-9spvl" Oct 01 13:59:57 crc kubenswrapper[4851]: I1001 13:59:57.470667 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d50126c-0124-4929-967c-523cf31894a5-utilities\") pod \"redhat-marketplace-9spvl\" (UID: \"0d50126c-0124-4929-967c-523cf31894a5\") " pod="openshift-marketplace/redhat-marketplace-9spvl" Oct 01 13:59:57 crc kubenswrapper[4851]: I1001 13:59:57.471193 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d50126c-0124-4929-967c-523cf31894a5-catalog-content\") pod \"redhat-marketplace-9spvl\" (UID: \"0d50126c-0124-4929-967c-523cf31894a5\") " pod="openshift-marketplace/redhat-marketplace-9spvl" Oct 01 13:59:57 crc kubenswrapper[4851]: I1001 13:59:57.471321 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d50126c-0124-4929-967c-523cf31894a5-utilities\") pod \"redhat-marketplace-9spvl\" (UID: \"0d50126c-0124-4929-967c-523cf31894a5\") " pod="openshift-marketplace/redhat-marketplace-9spvl" Oct 01 13:59:57 crc kubenswrapper[4851]: I1001 13:59:57.498892 4851 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn9sg\" (UniqueName: \"kubernetes.io/projected/0d50126c-0124-4929-967c-523cf31894a5-kube-api-access-gn9sg\") pod \"redhat-marketplace-9spvl\" (UID: \"0d50126c-0124-4929-967c-523cf31894a5\") " pod="openshift-marketplace/redhat-marketplace-9spvl" Oct 01 13:59:57 crc kubenswrapper[4851]: I1001 13:59:57.601050 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9spvl" Oct 01 13:59:58 crc kubenswrapper[4851]: I1001 13:59:58.115559 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9spvl"] Oct 01 13:59:58 crc kubenswrapper[4851]: W1001 13:59:58.123929 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d50126c_0124_4929_967c_523cf31894a5.slice/crio-430d1fae608cdc794dcb5d3f0a5112f85aeb7d4aa2c54369b4075c1a1104f50e WatchSource:0}: Error finding container 430d1fae608cdc794dcb5d3f0a5112f85aeb7d4aa2c54369b4075c1a1104f50e: Status 404 returned error can't find the container with id 430d1fae608cdc794dcb5d3f0a5112f85aeb7d4aa2c54369b4075c1a1104f50e Oct 01 13:59:58 crc kubenswrapper[4851]: I1001 13:59:58.562274 4851 generic.go:334] "Generic (PLEG): container finished" podID="0d50126c-0124-4929-967c-523cf31894a5" containerID="e510964bab2027c8dbf8caaa4d778cb4e258ba42d5ee5f53834aa70d7b0fca11" exitCode=0 Oct 01 13:59:58 crc kubenswrapper[4851]: I1001 13:59:58.562341 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9spvl" event={"ID":"0d50126c-0124-4929-967c-523cf31894a5","Type":"ContainerDied","Data":"e510964bab2027c8dbf8caaa4d778cb4e258ba42d5ee5f53834aa70d7b0fca11"} Oct 01 13:59:58 crc kubenswrapper[4851]: I1001 13:59:58.562686 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9spvl" event={"ID":"0d50126c-0124-4929-967c-523cf31894a5","Type":"ContainerStarted","Data":"430d1fae608cdc794dcb5d3f0a5112f85aeb7d4aa2c54369b4075c1a1104f50e"} Oct 01 14:00:00 crc kubenswrapper[4851]: I1001 14:00:00.188487 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322120-m9zzb"] Oct 01 14:00:00 crc kubenswrapper[4851]: I1001 14:00:00.190823 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322120-m9zzb"] Oct 01 14:00:00 crc kubenswrapper[4851]: I1001 14:00:00.191027 4851 util.go:30] "No sandbox for pod can be found. 
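The manager.go:1169 warning above is a startup race rather than a failure: cAdvisor sees the new crio-430d1fae… cgroup appear and tries to inspect the container before CRI-O has finished registering it, gets a 404, and the same ID shows up as ContainerStarted half a second later. Watchers typically treat that error as "not visible yet, let the next scan pick it up"; roughly (sentinel and helper names are hypothetical):

    package main

    import (
    	"errors"
    	"fmt"
    )

    // errNotYetKnown stands in for the runtime's "can't find the container
    // with id ..." 404 during the create race; real error types will differ.
    var errNotYetKnown = errors.New("can't find the container with id")

    func handleWatchEvent(cgroup string, inspect func(string) error) {
    	err := inspect(cgroup)
    	if errors.Is(err, errNotYetKnown) {
    		// Benign ordering race: the cgroup exists before the runtime has
    		// registered the container; the periodic scan will pick it up.
    		fmt.Println("W Failed to process watch event:", err)
    		return
    	}
    	if err != nil {
    		fmt.Println("E unexpected inspect error:", err)
    		return
    	}
    	fmt.Println("container tracked:", cgroup)
    }

    func main() {
    	handleWatchEvent("crio-430d1fae608c", func(string) error { return errNotYetKnown })
    }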
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-m9zzb" Oct 01 14:00:00 crc kubenswrapper[4851]: I1001 14:00:00.205162 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 14:00:00 crc kubenswrapper[4851]: I1001 14:00:00.205249 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 14:00:00 crc kubenswrapper[4851]: I1001 14:00:00.337445 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv5bq\" (UniqueName: \"kubernetes.io/projected/ac388880-3f2a-4a6c-868c-3bca0c63e6c3-kube-api-access-nv5bq\") pod \"collect-profiles-29322120-m9zzb\" (UID: \"ac388880-3f2a-4a6c-868c-3bca0c63e6c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-m9zzb" Oct 01 14:00:00 crc kubenswrapper[4851]: I1001 14:00:00.337889 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac388880-3f2a-4a6c-868c-3bca0c63e6c3-config-volume\") pod \"collect-profiles-29322120-m9zzb\" (UID: \"ac388880-3f2a-4a6c-868c-3bca0c63e6c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-m9zzb" Oct 01 14:00:00 crc kubenswrapper[4851]: I1001 14:00:00.338017 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac388880-3f2a-4a6c-868c-3bca0c63e6c3-secret-volume\") pod \"collect-profiles-29322120-m9zzb\" (UID: \"ac388880-3f2a-4a6c-868c-3bca0c63e6c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-m9zzb" Oct 01 14:00:00 crc kubenswrapper[4851]: I1001 14:00:00.439718 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv5bq\" (UniqueName: \"kubernetes.io/projected/ac388880-3f2a-4a6c-868c-3bca0c63e6c3-kube-api-access-nv5bq\") pod \"collect-profiles-29322120-m9zzb\" (UID: \"ac388880-3f2a-4a6c-868c-3bca0c63e6c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-m9zzb" Oct 01 14:00:00 crc kubenswrapper[4851]: I1001 14:00:00.439850 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac388880-3f2a-4a6c-868c-3bca0c63e6c3-config-volume\") pod \"collect-profiles-29322120-m9zzb\" (UID: \"ac388880-3f2a-4a6c-868c-3bca0c63e6c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-m9zzb" Oct 01 14:00:00 crc kubenswrapper[4851]: I1001 14:00:00.439895 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac388880-3f2a-4a6c-868c-3bca0c63e6c3-secret-volume\") pod \"collect-profiles-29322120-m9zzb\" (UID: \"ac388880-3f2a-4a6c-868c-3bca0c63e6c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-m9zzb" Oct 01 14:00:00 crc kubenswrapper[4851]: I1001 14:00:00.441404 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac388880-3f2a-4a6c-868c-3bca0c63e6c3-config-volume\") pod \"collect-profiles-29322120-m9zzb\" (UID: \"ac388880-3f2a-4a6c-868c-3bca0c63e6c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-m9zzb" Oct 01 14:00:00 crc 
Oct 01 14:00:00 crc kubenswrapper[4851]: I1001 14:00:00.454324 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac388880-3f2a-4a6c-868c-3bca0c63e6c3-secret-volume\") pod \"collect-profiles-29322120-m9zzb\" (UID: \"ac388880-3f2a-4a6c-868c-3bca0c63e6c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-m9zzb"
Oct 01 14:00:00 crc kubenswrapper[4851]: I1001 14:00:00.460034 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv5bq\" (UniqueName: \"kubernetes.io/projected/ac388880-3f2a-4a6c-868c-3bca0c63e6c3-kube-api-access-nv5bq\") pod \"collect-profiles-29322120-m9zzb\" (UID: \"ac388880-3f2a-4a6c-868c-3bca0c63e6c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-m9zzb"
Oct 01 14:00:00 crc kubenswrapper[4851]: I1001 14:00:00.548835 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-m9zzb"
Oct 01 14:00:00 crc kubenswrapper[4851]: I1001 14:00:00.604407 4851 generic.go:334] "Generic (PLEG): container finished" podID="0d50126c-0124-4929-967c-523cf31894a5" containerID="2a3b8504481c3de8d558c2eafa90683ddeb0a056c5b46b5ea7b122dfc28503cd" exitCode=0
Oct 01 14:00:00 crc kubenswrapper[4851]: I1001 14:00:00.604467 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9spvl" event={"ID":"0d50126c-0124-4929-967c-523cf31894a5","Type":"ContainerDied","Data":"2a3b8504481c3de8d558c2eafa90683ddeb0a056c5b46b5ea7b122dfc28503cd"}
Oct 01 14:00:01 crc kubenswrapper[4851]: I1001 14:00:01.034773 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322120-m9zzb"]
Oct 01 14:00:01 crc kubenswrapper[4851]: W1001 14:00:01.035355 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac388880_3f2a_4a6c_868c_3bca0c63e6c3.slice/crio-dc6fda5a4adf7ebc1d592d5528ac26ed17551aa6f120d7d79fce5fbb5b011089 WatchSource:0}: Error finding container dc6fda5a4adf7ebc1d592d5528ac26ed17551aa6f120d7d79fce5fbb5b011089: Status 404 returned error can't find the container with id dc6fda5a4adf7ebc1d592d5528ac26ed17551aa6f120d7d79fce5fbb5b011089
Oct 01 14:00:01 crc kubenswrapper[4851]: I1001 14:00:01.620253 4851 generic.go:334] "Generic (PLEG): container finished" podID="ac388880-3f2a-4a6c-868c-3bca0c63e6c3" containerID="3a435c68fd4ab84ed0f3737c30ce8f5d248bca6805200e12465545e62830cc0f" exitCode=0
Oct 01 14:00:01 crc kubenswrapper[4851]: I1001 14:00:01.620780 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-m9zzb" event={"ID":"ac388880-3f2a-4a6c-868c-3bca0c63e6c3","Type":"ContainerDied","Data":"3a435c68fd4ab84ed0f3737c30ce8f5d248bca6805200e12465545e62830cc0f"}
Oct 01 14:00:01 crc kubenswrapper[4851]: I1001 14:00:01.620885 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-m9zzb" event={"ID":"ac388880-3f2a-4a6c-868c-3bca0c63e6c3","Type":"ContainerStarted","Data":"dc6fda5a4adf7ebc1d592d5528ac26ed17551aa6f120d7d79fce5fbb5b011089"}
event={"ID":"0d50126c-0124-4929-967c-523cf31894a5","Type":"ContainerStarted","Data":"ef891a100a2a0fe7fc438aa418a9d74fc035e21973f0eb0af11f13a4ce148ee3"} Oct 01 14:00:02 crc kubenswrapper[4851]: I1001 14:00:02.683769 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9spvl" podStartSLOduration=2.754401653 podStartE2EDuration="5.683742838s" podCreationTimestamp="2025-10-01 13:59:57 +0000 UTC" firstStartedPulling="2025-10-01 13:59:58.563881589 +0000 UTC m=+4006.908999115" lastFinishedPulling="2025-10-01 14:00:01.493222794 +0000 UTC m=+4009.838340300" observedRunningTime="2025-10-01 14:00:02.671240601 +0000 UTC m=+4011.016358107" watchObservedRunningTime="2025-10-01 14:00:02.683742838 +0000 UTC m=+4011.028860354" Oct 01 14:00:03 crc kubenswrapper[4851]: I1001 14:00:03.063377 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-m9zzb" Oct 01 14:00:03 crc kubenswrapper[4851]: I1001 14:00:03.196396 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac388880-3f2a-4a6c-868c-3bca0c63e6c3-secret-volume\") pod \"ac388880-3f2a-4a6c-868c-3bca0c63e6c3\" (UID: \"ac388880-3f2a-4a6c-868c-3bca0c63e6c3\") " Oct 01 14:00:03 crc kubenswrapper[4851]: I1001 14:00:03.196790 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv5bq\" (UniqueName: \"kubernetes.io/projected/ac388880-3f2a-4a6c-868c-3bca0c63e6c3-kube-api-access-nv5bq\") pod \"ac388880-3f2a-4a6c-868c-3bca0c63e6c3\" (UID: \"ac388880-3f2a-4a6c-868c-3bca0c63e6c3\") " Oct 01 14:00:03 crc kubenswrapper[4851]: I1001 14:00:03.198180 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac388880-3f2a-4a6c-868c-3bca0c63e6c3-config-volume\") pod \"ac388880-3f2a-4a6c-868c-3bca0c63e6c3\" (UID: \"ac388880-3f2a-4a6c-868c-3bca0c63e6c3\") " Oct 01 14:00:03 crc kubenswrapper[4851]: I1001 14:00:03.199262 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac388880-3f2a-4a6c-868c-3bca0c63e6c3-config-volume" (OuterVolumeSpecName: "config-volume") pod "ac388880-3f2a-4a6c-868c-3bca0c63e6c3" (UID: "ac388880-3f2a-4a6c-868c-3bca0c63e6c3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:00:03 crc kubenswrapper[4851]: I1001 14:00:03.199760 4851 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac388880-3f2a-4a6c-868c-3bca0c63e6c3-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:03 crc kubenswrapper[4851]: I1001 14:00:03.206278 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac388880-3f2a-4a6c-868c-3bca0c63e6c3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ac388880-3f2a-4a6c-868c-3bca0c63e6c3" (UID: "ac388880-3f2a-4a6c-868c-3bca0c63e6c3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:00:03 crc kubenswrapper[4851]: I1001 14:00:03.206563 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac388880-3f2a-4a6c-868c-3bca0c63e6c3-kube-api-access-nv5bq" (OuterVolumeSpecName: "kube-api-access-nv5bq") pod "ac388880-3f2a-4a6c-868c-3bca0c63e6c3" (UID: "ac388880-3f2a-4a6c-868c-3bca0c63e6c3"). InnerVolumeSpecName "kube-api-access-nv5bq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:00:03 crc kubenswrapper[4851]: I1001 14:00:03.302301 4851 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac388880-3f2a-4a6c-868c-3bca0c63e6c3-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:03 crc kubenswrapper[4851]: I1001 14:00:03.302356 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv5bq\" (UniqueName: \"kubernetes.io/projected/ac388880-3f2a-4a6c-868c-3bca0c63e6c3-kube-api-access-nv5bq\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:03 crc kubenswrapper[4851]: I1001 14:00:03.655979 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-m9zzb" Oct 01 14:00:03 crc kubenswrapper[4851]: I1001 14:00:03.655988 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-m9zzb" event={"ID":"ac388880-3f2a-4a6c-868c-3bca0c63e6c3","Type":"ContainerDied","Data":"dc6fda5a4adf7ebc1d592d5528ac26ed17551aa6f120d7d79fce5fbb5b011089"} Oct 01 14:00:03 crc kubenswrapper[4851]: I1001 14:00:03.656101 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc6fda5a4adf7ebc1d592d5528ac26ed17551aa6f120d7d79fce5fbb5b011089" Oct 01 14:00:04 crc kubenswrapper[4851]: I1001 14:00:04.166882 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322075-r692b"] Oct 01 14:00:04 crc kubenswrapper[4851]: I1001 14:00:04.176627 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322075-r692b"] Oct 01 14:00:04 crc kubenswrapper[4851]: I1001 14:00:04.350898 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a" path="/var/lib/kubelet/pods/39b2bfbd-8498-4fb9-9ad3-91ad2de6f40a/volumes" Oct 01 14:00:07 crc kubenswrapper[4851]: I1001 14:00:07.601472 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9spvl" Oct 01 14:00:07 crc kubenswrapper[4851]: I1001 14:00:07.602109 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9spvl" Oct 01 14:00:07 crc kubenswrapper[4851]: I1001 14:00:07.688025 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9spvl" Oct 01 14:00:07 crc kubenswrapper[4851]: I1001 14:00:07.790192 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9spvl" Oct 01 14:00:07 crc kubenswrapper[4851]: I1001 14:00:07.955259 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9spvl"] Oct 01 14:00:09 crc kubenswrapper[4851]: I1001 14:00:09.746775 4851 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-9spvl" podUID="0d50126c-0124-4929-967c-523cf31894a5" containerName="registry-server" containerID="cri-o://ef891a100a2a0fe7fc438aa418a9d74fc035e21973f0eb0af11f13a4ce148ee3" gracePeriod=2 Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.316531 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9spvl" Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.357808 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d50126c-0124-4929-967c-523cf31894a5-utilities\") pod \"0d50126c-0124-4929-967c-523cf31894a5\" (UID: \"0d50126c-0124-4929-967c-523cf31894a5\") " Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.358013 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn9sg\" (UniqueName: \"kubernetes.io/projected/0d50126c-0124-4929-967c-523cf31894a5-kube-api-access-gn9sg\") pod \"0d50126c-0124-4929-967c-523cf31894a5\" (UID: \"0d50126c-0124-4929-967c-523cf31894a5\") " Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.358059 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d50126c-0124-4929-967c-523cf31894a5-catalog-content\") pod \"0d50126c-0124-4929-967c-523cf31894a5\" (UID: \"0d50126c-0124-4929-967c-523cf31894a5\") " Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.358621 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d50126c-0124-4929-967c-523cf31894a5-utilities" (OuterVolumeSpecName: "utilities") pod "0d50126c-0124-4929-967c-523cf31894a5" (UID: "0d50126c-0124-4929-967c-523cf31894a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.359074 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d50126c-0124-4929-967c-523cf31894a5-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.365193 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d50126c-0124-4929-967c-523cf31894a5-kube-api-access-gn9sg" (OuterVolumeSpecName: "kube-api-access-gn9sg") pod "0d50126c-0124-4929-967c-523cf31894a5" (UID: "0d50126c-0124-4929-967c-523cf31894a5"). InnerVolumeSpecName "kube-api-access-gn9sg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.378337 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d50126c-0124-4929-967c-523cf31894a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d50126c-0124-4929-967c-523cf31894a5" (UID: "0d50126c-0124-4929-967c-523cf31894a5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.461060 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn9sg\" (UniqueName: \"kubernetes.io/projected/0d50126c-0124-4929-967c-523cf31894a5-kube-api-access-gn9sg\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.461341 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d50126c-0124-4929-967c-523cf31894a5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.763213 4851 generic.go:334] "Generic (PLEG): container finished" podID="0d50126c-0124-4929-967c-523cf31894a5" containerID="ef891a100a2a0fe7fc438aa418a9d74fc035e21973f0eb0af11f13a4ce148ee3" exitCode=0 Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.763294 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9spvl" event={"ID":"0d50126c-0124-4929-967c-523cf31894a5","Type":"ContainerDied","Data":"ef891a100a2a0fe7fc438aa418a9d74fc035e21973f0eb0af11f13a4ce148ee3"} Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.763369 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9spvl" event={"ID":"0d50126c-0124-4929-967c-523cf31894a5","Type":"ContainerDied","Data":"430d1fae608cdc794dcb5d3f0a5112f85aeb7d4aa2c54369b4075c1a1104f50e"} Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.763403 4851 scope.go:117] "RemoveContainer" containerID="ef891a100a2a0fe7fc438aa418a9d74fc035e21973f0eb0af11f13a4ce148ee3" Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.765786 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9spvl" Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.802892 4851 scope.go:117] "RemoveContainer" containerID="2a3b8504481c3de8d558c2eafa90683ddeb0a056c5b46b5ea7b122dfc28503cd" Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.838847 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9spvl"] Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.847412 4851 scope.go:117] "RemoveContainer" containerID="e510964bab2027c8dbf8caaa4d778cb4e258ba42d5ee5f53834aa70d7b0fca11" Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.848875 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9spvl"] Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.911853 4851 scope.go:117] "RemoveContainer" containerID="ef891a100a2a0fe7fc438aa418a9d74fc035e21973f0eb0af11f13a4ce148ee3" Oct 01 14:00:10 crc kubenswrapper[4851]: E1001 14:00:10.912408 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef891a100a2a0fe7fc438aa418a9d74fc035e21973f0eb0af11f13a4ce148ee3\": container with ID starting with ef891a100a2a0fe7fc438aa418a9d74fc035e21973f0eb0af11f13a4ce148ee3 not found: ID does not exist" containerID="ef891a100a2a0fe7fc438aa418a9d74fc035e21973f0eb0af11f13a4ce148ee3" Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.912454 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef891a100a2a0fe7fc438aa418a9d74fc035e21973f0eb0af11f13a4ce148ee3"} err="failed to get container status \"ef891a100a2a0fe7fc438aa418a9d74fc035e21973f0eb0af11f13a4ce148ee3\": rpc error: code = NotFound desc = could not find container \"ef891a100a2a0fe7fc438aa418a9d74fc035e21973f0eb0af11f13a4ce148ee3\": container with ID starting with ef891a100a2a0fe7fc438aa418a9d74fc035e21973f0eb0af11f13a4ce148ee3 not found: ID does not exist" Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.912486 4851 scope.go:117] "RemoveContainer" containerID="2a3b8504481c3de8d558c2eafa90683ddeb0a056c5b46b5ea7b122dfc28503cd" Oct 01 14:00:10 crc kubenswrapper[4851]: E1001 14:00:10.912926 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a3b8504481c3de8d558c2eafa90683ddeb0a056c5b46b5ea7b122dfc28503cd\": container with ID starting with 2a3b8504481c3de8d558c2eafa90683ddeb0a056c5b46b5ea7b122dfc28503cd not found: ID does not exist" containerID="2a3b8504481c3de8d558c2eafa90683ddeb0a056c5b46b5ea7b122dfc28503cd" Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.912963 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a3b8504481c3de8d558c2eafa90683ddeb0a056c5b46b5ea7b122dfc28503cd"} err="failed to get container status \"2a3b8504481c3de8d558c2eafa90683ddeb0a056c5b46b5ea7b122dfc28503cd\": rpc error: code = NotFound desc = could not find container \"2a3b8504481c3de8d558c2eafa90683ddeb0a056c5b46b5ea7b122dfc28503cd\": container with ID starting with 2a3b8504481c3de8d558c2eafa90683ddeb0a056c5b46b5ea7b122dfc28503cd not found: ID does not exist" Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.912987 4851 scope.go:117] "RemoveContainer" containerID="e510964bab2027c8dbf8caaa4d778cb4e258ba42d5ee5f53834aa70d7b0fca11" Oct 01 14:00:10 crc kubenswrapper[4851]: E1001 14:00:10.913252 4851 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e510964bab2027c8dbf8caaa4d778cb4e258ba42d5ee5f53834aa70d7b0fca11\": container with ID starting with e510964bab2027c8dbf8caaa4d778cb4e258ba42d5ee5f53834aa70d7b0fca11 not found: ID does not exist" containerID="e510964bab2027c8dbf8caaa4d778cb4e258ba42d5ee5f53834aa70d7b0fca11" Oct 01 14:00:10 crc kubenswrapper[4851]: I1001 14:00:10.913288 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e510964bab2027c8dbf8caaa4d778cb4e258ba42d5ee5f53834aa70d7b0fca11"} err="failed to get container status \"e510964bab2027c8dbf8caaa4d778cb4e258ba42d5ee5f53834aa70d7b0fca11\": rpc error: code = NotFound desc = could not find container \"e510964bab2027c8dbf8caaa4d778cb4e258ba42d5ee5f53834aa70d7b0fca11\": container with ID starting with e510964bab2027c8dbf8caaa4d778cb4e258ba42d5ee5f53834aa70d7b0fca11 not found: ID does not exist" Oct 01 14:00:12 crc kubenswrapper[4851]: I1001 14:00:12.344478 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d50126c-0124-4929-967c-523cf31894a5" path="/var/lib/kubelet/pods/0d50126c-0124-4929-967c-523cf31894a5/volumes" Oct 01 14:00:19 crc kubenswrapper[4851]: I1001 14:00:19.411013 4851 scope.go:117] "RemoveContainer" containerID="e688f202087fe49bebaf26601d5e50907c99109fcb302411af563698f0864152" Oct 01 14:01:00 crc kubenswrapper[4851]: I1001 14:01:00.191707 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29322121-sgptb"] Oct 01 14:01:00 crc kubenswrapper[4851]: E1001 14:01:00.192887 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d50126c-0124-4929-967c-523cf31894a5" containerName="extract-content" Oct 01 14:01:00 crc kubenswrapper[4851]: I1001 14:01:00.192906 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d50126c-0124-4929-967c-523cf31894a5" containerName="extract-content" Oct 01 14:01:00 crc kubenswrapper[4851]: E1001 14:01:00.192928 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d50126c-0124-4929-967c-523cf31894a5" containerName="registry-server" Oct 01 14:01:00 crc kubenswrapper[4851]: I1001 14:01:00.192936 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d50126c-0124-4929-967c-523cf31894a5" containerName="registry-server" Oct 01 14:01:00 crc kubenswrapper[4851]: E1001 14:01:00.192960 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac388880-3f2a-4a6c-868c-3bca0c63e6c3" containerName="collect-profiles" Oct 01 14:01:00 crc kubenswrapper[4851]: I1001 14:01:00.192970 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac388880-3f2a-4a6c-868c-3bca0c63e6c3" containerName="collect-profiles" Oct 01 14:01:00 crc kubenswrapper[4851]: E1001 14:01:00.192983 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d50126c-0124-4929-967c-523cf31894a5" containerName="extract-utilities" Oct 01 14:01:00 crc kubenswrapper[4851]: I1001 14:01:00.192991 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d50126c-0124-4929-967c-523cf31894a5" containerName="extract-utilities" Oct 01 14:01:00 crc kubenswrapper[4851]: I1001 14:01:00.193262 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac388880-3f2a-4a6c-868c-3bca0c63e6c3" containerName="collect-profiles" Oct 01 14:01:00 crc kubenswrapper[4851]: I1001 14:01:00.193279 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d50126c-0124-4929-967c-523cf31894a5" containerName="registry-server" Oct 01 14:01:00 crc 
Oct 01 14:01:00 crc kubenswrapper[4851]: I1001 14:01:00.194056 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29322121-sgptb"]
Oct 01 14:01:00 crc kubenswrapper[4851]: I1001 14:01:00.194146 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29322121-sgptb"
Oct 01 14:01:00 crc kubenswrapper[4851]: I1001 14:01:00.312407 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-config-data\") pod \"keystone-cron-29322121-sgptb\" (UID: \"22136cf3-22ce-4ebc-b5b6-0b61f25844e8\") " pod="openstack/keystone-cron-29322121-sgptb"
Oct 01 14:01:00 crc kubenswrapper[4851]: I1001 14:01:00.312749 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgk6c\" (UniqueName: \"kubernetes.io/projected/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-kube-api-access-tgk6c\") pod \"keystone-cron-29322121-sgptb\" (UID: \"22136cf3-22ce-4ebc-b5b6-0b61f25844e8\") " pod="openstack/keystone-cron-29322121-sgptb"
Oct 01 14:01:00 crc kubenswrapper[4851]: I1001 14:01:00.312777 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-combined-ca-bundle\") pod \"keystone-cron-29322121-sgptb\" (UID: \"22136cf3-22ce-4ebc-b5b6-0b61f25844e8\") " pod="openstack/keystone-cron-29322121-sgptb"
Oct 01 14:01:00 crc kubenswrapper[4851]: I1001 14:01:00.312823 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-fernet-keys\") pod \"keystone-cron-29322121-sgptb\" (UID: \"22136cf3-22ce-4ebc-b5b6-0b61f25844e8\") " pod="openstack/keystone-cron-29322121-sgptb"
Oct 01 14:01:00 crc kubenswrapper[4851]: I1001 14:01:00.415545 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-config-data\") pod \"keystone-cron-29322121-sgptb\" (UID: \"22136cf3-22ce-4ebc-b5b6-0b61f25844e8\") " pod="openstack/keystone-cron-29322121-sgptb"
Oct 01 14:01:00 crc kubenswrapper[4851]: I1001 14:01:00.415707 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgk6c\" (UniqueName: \"kubernetes.io/projected/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-kube-api-access-tgk6c\") pod \"keystone-cron-29322121-sgptb\" (UID: \"22136cf3-22ce-4ebc-b5b6-0b61f25844e8\") " pod="openstack/keystone-cron-29322121-sgptb"
Oct 01 14:01:00 crc kubenswrapper[4851]: I1001 14:01:00.415750 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-combined-ca-bundle\") pod \"keystone-cron-29322121-sgptb\" (UID: \"22136cf3-22ce-4ebc-b5b6-0b61f25844e8\") " pod="openstack/keystone-cron-29322121-sgptb"
Oct 01 14:01:00 crc kubenswrapper[4851]: I1001 14:01:00.415857 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-fernet-keys\") pod \"keystone-cron-29322121-sgptb\" (UID: \"22136cf3-22ce-4ebc-b5b6-0b61f25844e8\") " pod="openstack/keystone-cron-29322121-sgptb"
Oct 01 14:01:00 crc kubenswrapper[4851]: I1001 14:01:00.425471 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-fernet-keys\") pod \"keystone-cron-29322121-sgptb\" (UID: \"22136cf3-22ce-4ebc-b5b6-0b61f25844e8\") " pod="openstack/keystone-cron-29322121-sgptb"
Oct 01 14:01:00 crc kubenswrapper[4851]: I1001 14:01:00.426174 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-config-data\") pod \"keystone-cron-29322121-sgptb\" (UID: \"22136cf3-22ce-4ebc-b5b6-0b61f25844e8\") " pod="openstack/keystone-cron-29322121-sgptb"
Oct 01 14:01:00 crc kubenswrapper[4851]: I1001 14:01:00.427273 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-combined-ca-bundle\") pod \"keystone-cron-29322121-sgptb\" (UID: \"22136cf3-22ce-4ebc-b5b6-0b61f25844e8\") " pod="openstack/keystone-cron-29322121-sgptb"
Oct 01 14:01:00 crc kubenswrapper[4851]: I1001 14:01:00.437701 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgk6c\" (UniqueName: \"kubernetes.io/projected/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-kube-api-access-tgk6c\") pod \"keystone-cron-29322121-sgptb\" (UID: \"22136cf3-22ce-4ebc-b5b6-0b61f25844e8\") " pod="openstack/keystone-cron-29322121-sgptb"
Oct 01 14:01:00 crc kubenswrapper[4851]: I1001 14:01:00.543903 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29322121-sgptb"
Oct 01 14:01:01 crc kubenswrapper[4851]: I1001 14:01:01.064450 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29322121-sgptb"]
Oct 01 14:01:01 crc kubenswrapper[4851]: I1001 14:01:01.391885 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322121-sgptb" event={"ID":"22136cf3-22ce-4ebc-b5b6-0b61f25844e8","Type":"ContainerStarted","Data":"eab0f0aaef44fe62b7325f204b60cb78b6d17ee1bbbf3e635a1952ad56fdfcc4"}
Oct 01 14:01:01 crc kubenswrapper[4851]: I1001 14:01:01.392378 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322121-sgptb" event={"ID":"22136cf3-22ce-4ebc-b5b6-0b61f25844e8","Type":"ContainerStarted","Data":"a288634adc38afac1d4ef42fb0e0418930c184976ab5c7e010fe6b9203fe1058"}
Oct 01 14:01:01 crc kubenswrapper[4851]: I1001 14:01:01.440560 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29322121-sgptb" podStartSLOduration=1.4405298420000001 podStartE2EDuration="1.440529842s" podCreationTimestamp="2025-10-01 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:01:01.421611133 +0000 UTC m=+4069.766728699" watchObservedRunningTime="2025-10-01 14:01:01.440529842 +0000 UTC m=+4069.785647368"
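Unlike the marketplace pods earlier, this tracker entry has firstStartedPulling and lastFinishedPulling at Go's zero time (0001-01-01 00:00:00 +0000 UTC): the keystone-cron image was already on the node, nothing was pulled, and podStartSLOduration equals podStartE2EDuration exactly. Anything consuming these fields has to gate on the zero value before subtracting, e.g.:

    package main

    import (
    	"fmt"
    	"time"
    )

    // When no image pull happened, both pull timestamps stay at Go's zero
    // time, so pull duration must be gated on IsZero before subtracting.
    func pullDuration(firstStarted, lastFinished time.Time) time.Duration {
    	if firstStarted.IsZero() || lastFinished.IsZero() {
    		return 0 // image already on the node; SLO duration == e2e duration
    	}
    	return lastFinished.Sub(firstStarted)
    }

    func main() {
    	var zero time.Time // 0001-01-01 00:00:00 +0000 UTC, as logged
    	fmt.Println(pullDuration(zero, zero)) // 0s
    }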
event={"ID":"22136cf3-22ce-4ebc-b5b6-0b61f25844e8","Type":"ContainerDied","Data":"eab0f0aaef44fe62b7325f204b60cb78b6d17ee1bbbf3e635a1952ad56fdfcc4"} Oct 01 14:01:07 crc kubenswrapper[4851]: I1001 14:01:07.433529 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29322121-sgptb" Oct 01 14:01:07 crc kubenswrapper[4851]: I1001 14:01:07.455937 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322121-sgptb" event={"ID":"22136cf3-22ce-4ebc-b5b6-0b61f25844e8","Type":"ContainerDied","Data":"a288634adc38afac1d4ef42fb0e0418930c184976ab5c7e010fe6b9203fe1058"} Oct 01 14:01:07 crc kubenswrapper[4851]: I1001 14:01:07.455986 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a288634adc38afac1d4ef42fb0e0418930c184976ab5c7e010fe6b9203fe1058" Oct 01 14:01:07 crc kubenswrapper[4851]: I1001 14:01:07.456039 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29322121-sgptb" Oct 01 14:01:07 crc kubenswrapper[4851]: I1001 14:01:07.582850 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-config-data\") pod \"22136cf3-22ce-4ebc-b5b6-0b61f25844e8\" (UID: \"22136cf3-22ce-4ebc-b5b6-0b61f25844e8\") " Oct 01 14:01:07 crc kubenswrapper[4851]: I1001 14:01:07.583160 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-combined-ca-bundle\") pod \"22136cf3-22ce-4ebc-b5b6-0b61f25844e8\" (UID: \"22136cf3-22ce-4ebc-b5b6-0b61f25844e8\") " Oct 01 14:01:07 crc kubenswrapper[4851]: I1001 14:01:07.583212 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-fernet-keys\") pod \"22136cf3-22ce-4ebc-b5b6-0b61f25844e8\" (UID: \"22136cf3-22ce-4ebc-b5b6-0b61f25844e8\") " Oct 01 14:01:07 crc kubenswrapper[4851]: I1001 14:01:07.583313 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgk6c\" (UniqueName: \"kubernetes.io/projected/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-kube-api-access-tgk6c\") pod \"22136cf3-22ce-4ebc-b5b6-0b61f25844e8\" (UID: \"22136cf3-22ce-4ebc-b5b6-0b61f25844e8\") " Oct 01 14:01:07 crc kubenswrapper[4851]: I1001 14:01:07.589464 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "22136cf3-22ce-4ebc-b5b6-0b61f25844e8" (UID: "22136cf3-22ce-4ebc-b5b6-0b61f25844e8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:01:07 crc kubenswrapper[4851]: I1001 14:01:07.592712 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-kube-api-access-tgk6c" (OuterVolumeSpecName: "kube-api-access-tgk6c") pod "22136cf3-22ce-4ebc-b5b6-0b61f25844e8" (UID: "22136cf3-22ce-4ebc-b5b6-0b61f25844e8"). InnerVolumeSpecName "kube-api-access-tgk6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:01:07 crc kubenswrapper[4851]: I1001 14:01:07.622124 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22136cf3-22ce-4ebc-b5b6-0b61f25844e8" (UID: "22136cf3-22ce-4ebc-b5b6-0b61f25844e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:01:07 crc kubenswrapper[4851]: I1001 14:01:07.663624 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-config-data" (OuterVolumeSpecName: "config-data") pod "22136cf3-22ce-4ebc-b5b6-0b61f25844e8" (UID: "22136cf3-22ce-4ebc-b5b6-0b61f25844e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:01:07 crc kubenswrapper[4851]: I1001 14:01:07.685789 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:07 crc kubenswrapper[4851]: I1001 14:01:07.685836 4851 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:07 crc kubenswrapper[4851]: I1001 14:01:07.685857 4851 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:07 crc kubenswrapper[4851]: I1001 14:01:07.685877 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgk6c\" (UniqueName: \"kubernetes.io/projected/22136cf3-22ce-4ebc-b5b6-0b61f25844e8-kube-api-access-tgk6c\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:18 crc kubenswrapper[4851]: I1001 14:01:18.620951 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4fvkd"] Oct 01 14:01:18 crc kubenswrapper[4851]: E1001 14:01:18.621806 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22136cf3-22ce-4ebc-b5b6-0b61f25844e8" containerName="keystone-cron" Oct 01 14:01:18 crc kubenswrapper[4851]: I1001 14:01:18.621821 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="22136cf3-22ce-4ebc-b5b6-0b61f25844e8" containerName="keystone-cron" Oct 01 14:01:18 crc kubenswrapper[4851]: I1001 14:01:18.622027 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="22136cf3-22ce-4ebc-b5b6-0b61f25844e8" containerName="keystone-cron" Oct 01 14:01:18 crc kubenswrapper[4851]: I1001 14:01:18.623484 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4fvkd" Oct 01 14:01:18 crc kubenswrapper[4851]: I1001 14:01:18.642750 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4fvkd"] Oct 01 14:01:18 crc kubenswrapper[4851]: I1001 14:01:18.647259 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0-catalog-content\") pod \"redhat-operators-4fvkd\" (UID: \"e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0\") " pod="openshift-marketplace/redhat-operators-4fvkd" Oct 01 14:01:18 crc kubenswrapper[4851]: I1001 14:01:18.647484 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk5jn\" (UniqueName: \"kubernetes.io/projected/e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0-kube-api-access-pk5jn\") pod \"redhat-operators-4fvkd\" (UID: \"e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0\") " pod="openshift-marketplace/redhat-operators-4fvkd" Oct 01 14:01:18 crc kubenswrapper[4851]: I1001 14:01:18.647583 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0-utilities\") pod \"redhat-operators-4fvkd\" (UID: \"e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0\") " pod="openshift-marketplace/redhat-operators-4fvkd" Oct 01 14:01:18 crc kubenswrapper[4851]: I1001 14:01:18.749424 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk5jn\" (UniqueName: \"kubernetes.io/projected/e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0-kube-api-access-pk5jn\") pod \"redhat-operators-4fvkd\" (UID: \"e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0\") " pod="openshift-marketplace/redhat-operators-4fvkd" Oct 01 14:01:18 crc kubenswrapper[4851]: I1001 14:01:18.749515 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0-utilities\") pod \"redhat-operators-4fvkd\" (UID: \"e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0\") " pod="openshift-marketplace/redhat-operators-4fvkd" Oct 01 14:01:18 crc kubenswrapper[4851]: I1001 14:01:18.749609 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0-catalog-content\") pod \"redhat-operators-4fvkd\" (UID: \"e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0\") " pod="openshift-marketplace/redhat-operators-4fvkd" Oct 01 14:01:18 crc kubenswrapper[4851]: I1001 14:01:18.750334 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0-catalog-content\") pod \"redhat-operators-4fvkd\" (UID: \"e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0\") " pod="openshift-marketplace/redhat-operators-4fvkd" Oct 01 14:01:18 crc kubenswrapper[4851]: I1001 14:01:18.750336 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0-utilities\") pod \"redhat-operators-4fvkd\" (UID: \"e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0\") " pod="openshift-marketplace/redhat-operators-4fvkd" Oct 01 14:01:18 crc kubenswrapper[4851]: I1001 14:01:18.778379 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-pk5jn\" (UniqueName: \"kubernetes.io/projected/e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0-kube-api-access-pk5jn\") pod \"redhat-operators-4fvkd\" (UID: \"e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0\") " pod="openshift-marketplace/redhat-operators-4fvkd" Oct 01 14:01:18 crc kubenswrapper[4851]: I1001 14:01:18.951967 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4fvkd" Oct 01 14:01:19 crc kubenswrapper[4851]: I1001 14:01:19.409589 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4fvkd"] Oct 01 14:01:19 crc kubenswrapper[4851]: I1001 14:01:19.635184 4851 generic.go:334] "Generic (PLEG): container finished" podID="e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0" containerID="d9f7d1833ca479bb1fd55ea872b2d70ca83557c2071dfb3a7a794b66b5b57f2f" exitCode=0 Oct 01 14:01:19 crc kubenswrapper[4851]: I1001 14:01:19.635231 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4fvkd" event={"ID":"e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0","Type":"ContainerDied","Data":"d9f7d1833ca479bb1fd55ea872b2d70ca83557c2071dfb3a7a794b66b5b57f2f"} Oct 01 14:01:19 crc kubenswrapper[4851]: I1001 14:01:19.635271 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4fvkd" event={"ID":"e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0","Type":"ContainerStarted","Data":"1e75b1028781b87ffffb82f7ac1feaeb5d83dd913f3645c50cdc55d205034fb5"} Oct 01 14:01:21 crc kubenswrapper[4851]: I1001 14:01:21.656743 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4fvkd" event={"ID":"e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0","Type":"ContainerStarted","Data":"6edfd47e15b1782bc4ff756688a4431b48f6e8e76086fe97ef399189d3b97964"} Oct 01 14:01:25 crc kubenswrapper[4851]: I1001 14:01:25.704933 4851 generic.go:334] "Generic (PLEG): container finished" podID="e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0" containerID="6edfd47e15b1782bc4ff756688a4431b48f6e8e76086fe97ef399189d3b97964" exitCode=0 Oct 01 14:01:25 crc kubenswrapper[4851]: I1001 14:01:25.705021 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4fvkd" event={"ID":"e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0","Type":"ContainerDied","Data":"6edfd47e15b1782bc4ff756688a4431b48f6e8e76086fe97ef399189d3b97964"} Oct 01 14:01:26 crc kubenswrapper[4851]: I1001 14:01:26.717787 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4fvkd" event={"ID":"e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0","Type":"ContainerStarted","Data":"587da31b3ddee80cb291fdc600b79d88464e0c1703d0d3ff85ff5b80073aa74e"} Oct 01 14:01:26 crc kubenswrapper[4851]: I1001 14:01:26.737763 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4fvkd" podStartSLOduration=2.171980544 podStartE2EDuration="8.73774755s" podCreationTimestamp="2025-10-01 14:01:18 +0000 UTC" firstStartedPulling="2025-10-01 14:01:19.6367692 +0000 UTC m=+4087.981886676" lastFinishedPulling="2025-10-01 14:01:26.202536186 +0000 UTC m=+4094.547653682" observedRunningTime="2025-10-01 14:01:26.7324859 +0000 UTC m=+4095.077603386" watchObservedRunningTime="2025-10-01 14:01:26.73774755 +0000 UTC m=+4095.082865036" Oct 01 14:01:28 crc kubenswrapper[4851]: I1001 14:01:28.952958 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4fvkd" Oct 01 
14:01:28 crc kubenswrapper[4851]: I1001 14:01:28.953233 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4fvkd" Oct 01 14:01:30 crc kubenswrapper[4851]: I1001 14:01:30.020493 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4fvkd" podUID="e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0" containerName="registry-server" probeResult="failure" output=< Oct 01 14:01:30 crc kubenswrapper[4851]: timeout: failed to connect service ":50051" within 1s Oct 01 14:01:30 crc kubenswrapper[4851]: > Oct 01 14:01:39 crc kubenswrapper[4851]: I1001 14:01:39.048695 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4fvkd" Oct 01 14:01:39 crc kubenswrapper[4851]: I1001 14:01:39.122568 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4fvkd" Oct 01 14:01:39 crc kubenswrapper[4851]: I1001 14:01:39.328009 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4fvkd"] Oct 01 14:01:40 crc kubenswrapper[4851]: I1001 14:01:40.868453 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4fvkd" podUID="e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0" containerName="registry-server" containerID="cri-o://587da31b3ddee80cb291fdc600b79d88464e0c1703d0d3ff85ff5b80073aa74e" gracePeriod=2 Oct 01 14:01:41 crc kubenswrapper[4851]: I1001 14:01:41.390953 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4fvkd" Oct 01 14:01:41 crc kubenswrapper[4851]: I1001 14:01:41.477338 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0-catalog-content\") pod \"e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0\" (UID: \"e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0\") " Oct 01 14:01:41 crc kubenswrapper[4851]: I1001 14:01:41.477444 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk5jn\" (UniqueName: \"kubernetes.io/projected/e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0-kube-api-access-pk5jn\") pod \"e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0\" (UID: \"e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0\") " Oct 01 14:01:41 crc kubenswrapper[4851]: I1001 14:01:41.477640 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0-utilities\") pod \"e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0\" (UID: \"e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0\") " Oct 01 14:01:41 crc kubenswrapper[4851]: I1001 14:01:41.480403 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0-utilities" (OuterVolumeSpecName: "utilities") pod "e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0" (UID: "e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:01:41 crc kubenswrapper[4851]: I1001 14:01:41.490805 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0-kube-api-access-pk5jn" (OuterVolumeSpecName: "kube-api-access-pk5jn") pod "e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0" (UID: "e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0"). InnerVolumeSpecName "kube-api-access-pk5jn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:01:41 crc kubenswrapper[4851]: I1001 14:01:41.567467 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0" (UID: "e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:01:41 crc kubenswrapper[4851]: I1001 14:01:41.580016 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:41 crc kubenswrapper[4851]: I1001 14:01:41.580042 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk5jn\" (UniqueName: \"kubernetes.io/projected/e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0-kube-api-access-pk5jn\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:41 crc kubenswrapper[4851]: I1001 14:01:41.580054 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:41 crc kubenswrapper[4851]: I1001 14:01:41.882997 4851 generic.go:334] "Generic (PLEG): container finished" podID="e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0" containerID="587da31b3ddee80cb291fdc600b79d88464e0c1703d0d3ff85ff5b80073aa74e" exitCode=0 Oct 01 14:01:41 crc kubenswrapper[4851]: I1001 14:01:41.883056 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4fvkd" event={"ID":"e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0","Type":"ContainerDied","Data":"587da31b3ddee80cb291fdc600b79d88464e0c1703d0d3ff85ff5b80073aa74e"} Oct 01 14:01:41 crc kubenswrapper[4851]: I1001 14:01:41.883086 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4fvkd" Oct 01 14:01:41 crc kubenswrapper[4851]: I1001 14:01:41.883108 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4fvkd" event={"ID":"e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0","Type":"ContainerDied","Data":"1e75b1028781b87ffffb82f7ac1feaeb5d83dd913f3645c50cdc55d205034fb5"} Oct 01 14:01:41 crc kubenswrapper[4851]: I1001 14:01:41.883140 4851 scope.go:117] "RemoveContainer" containerID="587da31b3ddee80cb291fdc600b79d88464e0c1703d0d3ff85ff5b80073aa74e" Oct 01 14:01:41 crc kubenswrapper[4851]: I1001 14:01:41.924575 4851 scope.go:117] "RemoveContainer" containerID="6edfd47e15b1782bc4ff756688a4431b48f6e8e76086fe97ef399189d3b97964" Oct 01 14:01:41 crc kubenswrapper[4851]: I1001 14:01:41.933088 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4fvkd"] Oct 01 14:01:41 crc kubenswrapper[4851]: I1001 14:01:41.950219 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4fvkd"] Oct 01 14:01:41 crc kubenswrapper[4851]: I1001 14:01:41.958480 4851 scope.go:117] "RemoveContainer" containerID="d9f7d1833ca479bb1fd55ea872b2d70ca83557c2071dfb3a7a794b66b5b57f2f" Oct 01 14:01:42 crc kubenswrapper[4851]: I1001 14:01:42.023988 4851 scope.go:117] "RemoveContainer" containerID="587da31b3ddee80cb291fdc600b79d88464e0c1703d0d3ff85ff5b80073aa74e" Oct 01 14:01:42 crc kubenswrapper[4851]: E1001 14:01:42.024764 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"587da31b3ddee80cb291fdc600b79d88464e0c1703d0d3ff85ff5b80073aa74e\": container with ID starting with 587da31b3ddee80cb291fdc600b79d88464e0c1703d0d3ff85ff5b80073aa74e not found: ID does not exist" containerID="587da31b3ddee80cb291fdc600b79d88464e0c1703d0d3ff85ff5b80073aa74e" Oct 01 14:01:42 crc kubenswrapper[4851]: I1001 14:01:42.024832 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"587da31b3ddee80cb291fdc600b79d88464e0c1703d0d3ff85ff5b80073aa74e"} err="failed to get container status \"587da31b3ddee80cb291fdc600b79d88464e0c1703d0d3ff85ff5b80073aa74e\": rpc error: code = NotFound desc = could not find container \"587da31b3ddee80cb291fdc600b79d88464e0c1703d0d3ff85ff5b80073aa74e\": container with ID starting with 587da31b3ddee80cb291fdc600b79d88464e0c1703d0d3ff85ff5b80073aa74e not found: ID does not exist" Oct 01 14:01:42 crc kubenswrapper[4851]: I1001 14:01:42.024879 4851 scope.go:117] "RemoveContainer" containerID="6edfd47e15b1782bc4ff756688a4431b48f6e8e76086fe97ef399189d3b97964" Oct 01 14:01:42 crc kubenswrapper[4851]: E1001 14:01:42.025697 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6edfd47e15b1782bc4ff756688a4431b48f6e8e76086fe97ef399189d3b97964\": container with ID starting with 6edfd47e15b1782bc4ff756688a4431b48f6e8e76086fe97ef399189d3b97964 not found: ID does not exist" containerID="6edfd47e15b1782bc4ff756688a4431b48f6e8e76086fe97ef399189d3b97964" Oct 01 14:01:42 crc kubenswrapper[4851]: I1001 14:01:42.025741 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6edfd47e15b1782bc4ff756688a4431b48f6e8e76086fe97ef399189d3b97964"} err="failed to get container status \"6edfd47e15b1782bc4ff756688a4431b48f6e8e76086fe97ef399189d3b97964\": rpc error: code = NotFound desc = could not find container 
\"6edfd47e15b1782bc4ff756688a4431b48f6e8e76086fe97ef399189d3b97964\": container with ID starting with 6edfd47e15b1782bc4ff756688a4431b48f6e8e76086fe97ef399189d3b97964 not found: ID does not exist" Oct 01 14:01:42 crc kubenswrapper[4851]: I1001 14:01:42.025770 4851 scope.go:117] "RemoveContainer" containerID="d9f7d1833ca479bb1fd55ea872b2d70ca83557c2071dfb3a7a794b66b5b57f2f" Oct 01 14:01:42 crc kubenswrapper[4851]: E1001 14:01:42.026194 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9f7d1833ca479bb1fd55ea872b2d70ca83557c2071dfb3a7a794b66b5b57f2f\": container with ID starting with d9f7d1833ca479bb1fd55ea872b2d70ca83557c2071dfb3a7a794b66b5b57f2f not found: ID does not exist" containerID="d9f7d1833ca479bb1fd55ea872b2d70ca83557c2071dfb3a7a794b66b5b57f2f" Oct 01 14:01:42 crc kubenswrapper[4851]: I1001 14:01:42.026236 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9f7d1833ca479bb1fd55ea872b2d70ca83557c2071dfb3a7a794b66b5b57f2f"} err="failed to get container status \"d9f7d1833ca479bb1fd55ea872b2d70ca83557c2071dfb3a7a794b66b5b57f2f\": rpc error: code = NotFound desc = could not find container \"d9f7d1833ca479bb1fd55ea872b2d70ca83557c2071dfb3a7a794b66b5b57f2f\": container with ID starting with d9f7d1833ca479bb1fd55ea872b2d70ca83557c2071dfb3a7a794b66b5b57f2f not found: ID does not exist" Oct 01 14:01:42 crc kubenswrapper[4851]: I1001 14:01:42.347543 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0" path="/var/lib/kubelet/pods/e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0/volumes" Oct 01 14:02:00 crc kubenswrapper[4851]: I1001 14:02:00.050281 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:02:00 crc kubenswrapper[4851]: I1001 14:02:00.052375 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:02:30 crc kubenswrapper[4851]: I1001 14:02:30.050661 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:02:30 crc kubenswrapper[4851]: I1001 14:02:30.051315 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:03:00 crc kubenswrapper[4851]: I1001 14:03:00.049970 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:03:00 crc 
kubenswrapper[4851]: I1001 14:03:00.050623 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:03:00 crc kubenswrapper[4851]: I1001 14:03:00.050671 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 14:03:00 crc kubenswrapper[4851]: I1001 14:03:00.051226 4851 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8fe8ded7a2296c0a9a6846baf026cdf6a0bd0acd4bb0b8ed66d9e9879aebcdb9"} pod="openshift-machine-config-operator/machine-config-daemon-fv72m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 14:03:00 crc kubenswrapper[4851]: I1001 14:03:00.051289 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" containerID="cri-o://8fe8ded7a2296c0a9a6846baf026cdf6a0bd0acd4bb0b8ed66d9e9879aebcdb9" gracePeriod=600 Oct 01 14:03:00 crc kubenswrapper[4851]: I1001 14:03:00.835390 4851 generic.go:334] "Generic (PLEG): container finished" podID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerID="8fe8ded7a2296c0a9a6846baf026cdf6a0bd0acd4bb0b8ed66d9e9879aebcdb9" exitCode=0 Oct 01 14:03:00 crc kubenswrapper[4851]: I1001 14:03:00.835466 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerDied","Data":"8fe8ded7a2296c0a9a6846baf026cdf6a0bd0acd4bb0b8ed66d9e9879aebcdb9"} Oct 01 14:03:00 crc kubenswrapper[4851]: I1001 14:03:00.835991 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290"} Oct 01 14:03:00 crc kubenswrapper[4851]: I1001 14:03:00.836018 4851 scope.go:117] "RemoveContainer" containerID="e6ff4e9129b299a3a18800c27346760f44f4d2715424e9f96a05b0aaddad8dbe" Oct 01 14:03:35 crc kubenswrapper[4851]: I1001 14:03:35.230384 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jgpxp"] Oct 01 14:03:35 crc kubenswrapper[4851]: E1001 14:03:35.231348 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0" containerName="registry-server" Oct 01 14:03:35 crc kubenswrapper[4851]: I1001 14:03:35.231364 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0" containerName="registry-server" Oct 01 14:03:35 crc kubenswrapper[4851]: E1001 14:03:35.231399 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0" containerName="extract-content" Oct 01 14:03:35 crc kubenswrapper[4851]: I1001 14:03:35.231408 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0" containerName="extract-content" Oct 01 14:03:35 crc kubenswrapper[4851]: E1001 14:03:35.231453 4851 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0" containerName="extract-utilities" Oct 01 14:03:35 crc kubenswrapper[4851]: I1001 14:03:35.231463 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0" containerName="extract-utilities" Oct 01 14:03:35 crc kubenswrapper[4851]: I1001 14:03:35.231745 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0d43c0a-80c7-4ce1-b82c-3ba03e89cab0" containerName="registry-server" Oct 01 14:03:35 crc kubenswrapper[4851]: I1001 14:03:35.233690 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jgpxp" Oct 01 14:03:35 crc kubenswrapper[4851]: I1001 14:03:35.244062 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jgpxp"] Oct 01 14:03:35 crc kubenswrapper[4851]: I1001 14:03:35.430992 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f6bd6a7-f815-49a8-870f-26d58e4c7084-catalog-content\") pod \"certified-operators-jgpxp\" (UID: \"9f6bd6a7-f815-49a8-870f-26d58e4c7084\") " pod="openshift-marketplace/certified-operators-jgpxp" Oct 01 14:03:35 crc kubenswrapper[4851]: I1001 14:03:35.431418 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f6bd6a7-f815-49a8-870f-26d58e4c7084-utilities\") pod \"certified-operators-jgpxp\" (UID: \"9f6bd6a7-f815-49a8-870f-26d58e4c7084\") " pod="openshift-marketplace/certified-operators-jgpxp" Oct 01 14:03:35 crc kubenswrapper[4851]: I1001 14:03:35.431510 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq75r\" (UniqueName: \"kubernetes.io/projected/9f6bd6a7-f815-49a8-870f-26d58e4c7084-kube-api-access-qq75r\") pod \"certified-operators-jgpxp\" (UID: \"9f6bd6a7-f815-49a8-870f-26d58e4c7084\") " pod="openshift-marketplace/certified-operators-jgpxp" Oct 01 14:03:35 crc kubenswrapper[4851]: I1001 14:03:35.533580 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f6bd6a7-f815-49a8-870f-26d58e4c7084-utilities\") pod \"certified-operators-jgpxp\" (UID: \"9f6bd6a7-f815-49a8-870f-26d58e4c7084\") " pod="openshift-marketplace/certified-operators-jgpxp" Oct 01 14:03:35 crc kubenswrapper[4851]: I1001 14:03:35.534042 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f6bd6a7-f815-49a8-870f-26d58e4c7084-utilities\") pod \"certified-operators-jgpxp\" (UID: \"9f6bd6a7-f815-49a8-870f-26d58e4c7084\") " pod="openshift-marketplace/certified-operators-jgpxp" Oct 01 14:03:35 crc kubenswrapper[4851]: I1001 14:03:35.534161 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq75r\" (UniqueName: \"kubernetes.io/projected/9f6bd6a7-f815-49a8-870f-26d58e4c7084-kube-api-access-qq75r\") pod \"certified-operators-jgpxp\" (UID: \"9f6bd6a7-f815-49a8-870f-26d58e4c7084\") " pod="openshift-marketplace/certified-operators-jgpxp" Oct 01 14:03:35 crc kubenswrapper[4851]: I1001 14:03:35.535523 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9f6bd6a7-f815-49a8-870f-26d58e4c7084-catalog-content\") pod \"certified-operators-jgpxp\" (UID: \"9f6bd6a7-f815-49a8-870f-26d58e4c7084\") " pod="openshift-marketplace/certified-operators-jgpxp" Oct 01 14:03:35 crc kubenswrapper[4851]: I1001 14:03:35.535797 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f6bd6a7-f815-49a8-870f-26d58e4c7084-catalog-content\") pod \"certified-operators-jgpxp\" (UID: \"9f6bd6a7-f815-49a8-870f-26d58e4c7084\") " pod="openshift-marketplace/certified-operators-jgpxp" Oct 01 14:03:35 crc kubenswrapper[4851]: I1001 14:03:35.558864 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq75r\" (UniqueName: \"kubernetes.io/projected/9f6bd6a7-f815-49a8-870f-26d58e4c7084-kube-api-access-qq75r\") pod \"certified-operators-jgpxp\" (UID: \"9f6bd6a7-f815-49a8-870f-26d58e4c7084\") " pod="openshift-marketplace/certified-operators-jgpxp" Oct 01 14:03:35 crc kubenswrapper[4851]: I1001 14:03:35.563852 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jgpxp" Oct 01 14:03:36 crc kubenswrapper[4851]: I1001 14:03:36.152711 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jgpxp"] Oct 01 14:03:36 crc kubenswrapper[4851]: I1001 14:03:36.278065 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgpxp" event={"ID":"9f6bd6a7-f815-49a8-870f-26d58e4c7084","Type":"ContainerStarted","Data":"44c81427ac8ca6f0405ced366a6e5076d427436296c7a4d56fe286d05eee0f2d"} Oct 01 14:03:37 crc kubenswrapper[4851]: I1001 14:03:37.292273 4851 generic.go:334] "Generic (PLEG): container finished" podID="9f6bd6a7-f815-49a8-870f-26d58e4c7084" containerID="c89be710b7c14e5c1053687e4ef21119f01b8a0dceacf377ea5d79b3af46bc1a" exitCode=0 Oct 01 14:03:37 crc kubenswrapper[4851]: I1001 14:03:37.292357 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgpxp" event={"ID":"9f6bd6a7-f815-49a8-870f-26d58e4c7084","Type":"ContainerDied","Data":"c89be710b7c14e5c1053687e4ef21119f01b8a0dceacf377ea5d79b3af46bc1a"} Oct 01 14:03:37 crc kubenswrapper[4851]: I1001 14:03:37.297550 4851 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 14:03:39 crc kubenswrapper[4851]: I1001 14:03:39.318065 4851 generic.go:334] "Generic (PLEG): container finished" podID="9f6bd6a7-f815-49a8-870f-26d58e4c7084" containerID="aa5d1d45ca485c3afc8cd02fbc6b2a075be606a3231a13ea868cdd78cacf98f1" exitCode=0 Oct 01 14:03:39 crc kubenswrapper[4851]: I1001 14:03:39.318223 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgpxp" event={"ID":"9f6bd6a7-f815-49a8-870f-26d58e4c7084","Type":"ContainerDied","Data":"aa5d1d45ca485c3afc8cd02fbc6b2a075be606a3231a13ea868cdd78cacf98f1"} Oct 01 14:03:40 crc kubenswrapper[4851]: I1001 14:03:40.361040 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgpxp" event={"ID":"9f6bd6a7-f815-49a8-870f-26d58e4c7084","Type":"ContainerStarted","Data":"2baef2a23eb9a3ebebecd9ee3cd09a838a354e959e8ab8cacbbac771f2aa383b"} Oct 01 14:03:40 crc kubenswrapper[4851]: I1001 14:03:40.385819 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jgpxp" 
podStartSLOduration=2.892852072 podStartE2EDuration="5.385799141s" podCreationTimestamp="2025-10-01 14:03:35 +0000 UTC" firstStartedPulling="2025-10-01 14:03:37.297115981 +0000 UTC m=+4225.642233467" lastFinishedPulling="2025-10-01 14:03:39.79006304 +0000 UTC m=+4228.135180536" observedRunningTime="2025-10-01 14:03:40.363920647 +0000 UTC m=+4228.709038143" watchObservedRunningTime="2025-10-01 14:03:40.385799141 +0000 UTC m=+4228.730916627" Oct 01 14:03:45 crc kubenswrapper[4851]: I1001 14:03:45.565182 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jgpxp" Oct 01 14:03:45 crc kubenswrapper[4851]: I1001 14:03:45.566316 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jgpxp" Oct 01 14:03:46 crc kubenswrapper[4851]: I1001 14:03:46.268243 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jgpxp" Oct 01 14:03:46 crc kubenswrapper[4851]: I1001 14:03:46.491025 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jgpxp" Oct 01 14:03:46 crc kubenswrapper[4851]: I1001 14:03:46.566220 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jgpxp"] Oct 01 14:03:48 crc kubenswrapper[4851]: I1001 14:03:48.438358 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jgpxp" podUID="9f6bd6a7-f815-49a8-870f-26d58e4c7084" containerName="registry-server" containerID="cri-o://2baef2a23eb9a3ebebecd9ee3cd09a838a354e959e8ab8cacbbac771f2aa383b" gracePeriod=2 Oct 01 14:03:48 crc kubenswrapper[4851]: I1001 14:03:48.969002 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jgpxp" Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.072313 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq75r\" (UniqueName: \"kubernetes.io/projected/9f6bd6a7-f815-49a8-870f-26d58e4c7084-kube-api-access-qq75r\") pod \"9f6bd6a7-f815-49a8-870f-26d58e4c7084\" (UID: \"9f6bd6a7-f815-49a8-870f-26d58e4c7084\") " Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.073172 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f6bd6a7-f815-49a8-870f-26d58e4c7084-catalog-content\") pod \"9f6bd6a7-f815-49a8-870f-26d58e4c7084\" (UID: \"9f6bd6a7-f815-49a8-870f-26d58e4c7084\") " Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.073349 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f6bd6a7-f815-49a8-870f-26d58e4c7084-utilities\") pod \"9f6bd6a7-f815-49a8-870f-26d58e4c7084\" (UID: \"9f6bd6a7-f815-49a8-870f-26d58e4c7084\") " Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.074248 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f6bd6a7-f815-49a8-870f-26d58e4c7084-utilities" (OuterVolumeSpecName: "utilities") pod "9f6bd6a7-f815-49a8-870f-26d58e4c7084" (UID: "9f6bd6a7-f815-49a8-870f-26d58e4c7084"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.074881 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f6bd6a7-f815-49a8-870f-26d58e4c7084-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.084204 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f6bd6a7-f815-49a8-870f-26d58e4c7084-kube-api-access-qq75r" (OuterVolumeSpecName: "kube-api-access-qq75r") pod "9f6bd6a7-f815-49a8-870f-26d58e4c7084" (UID: "9f6bd6a7-f815-49a8-870f-26d58e4c7084"). InnerVolumeSpecName "kube-api-access-qq75r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.138185 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f6bd6a7-f815-49a8-870f-26d58e4c7084-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f6bd6a7-f815-49a8-870f-26d58e4c7084" (UID: "9f6bd6a7-f815-49a8-870f-26d58e4c7084"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.176319 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f6bd6a7-f815-49a8-870f-26d58e4c7084-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.176359 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq75r\" (UniqueName: \"kubernetes.io/projected/9f6bd6a7-f815-49a8-870f-26d58e4c7084-kube-api-access-qq75r\") on node \"crc\" DevicePath \"\"" Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.454176 4851 generic.go:334] "Generic (PLEG): container finished" podID="9f6bd6a7-f815-49a8-870f-26d58e4c7084" containerID="2baef2a23eb9a3ebebecd9ee3cd09a838a354e959e8ab8cacbbac771f2aa383b" exitCode=0 Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.454237 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgpxp" event={"ID":"9f6bd6a7-f815-49a8-870f-26d58e4c7084","Type":"ContainerDied","Data":"2baef2a23eb9a3ebebecd9ee3cd09a838a354e959e8ab8cacbbac771f2aa383b"} Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.454279 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgpxp" event={"ID":"9f6bd6a7-f815-49a8-870f-26d58e4c7084","Type":"ContainerDied","Data":"44c81427ac8ca6f0405ced366a6e5076d427436296c7a4d56fe286d05eee0f2d"} Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.454307 4851 scope.go:117] "RemoveContainer" containerID="2baef2a23eb9a3ebebecd9ee3cd09a838a354e959e8ab8cacbbac771f2aa383b" Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.454568 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jgpxp" Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.494915 4851 scope.go:117] "RemoveContainer" containerID="aa5d1d45ca485c3afc8cd02fbc6b2a075be606a3231a13ea868cdd78cacf98f1" Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.516384 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jgpxp"] Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.535674 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jgpxp"] Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.566419 4851 scope.go:117] "RemoveContainer" containerID="c89be710b7c14e5c1053687e4ef21119f01b8a0dceacf377ea5d79b3af46bc1a" Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.599809 4851 scope.go:117] "RemoveContainer" containerID="2baef2a23eb9a3ebebecd9ee3cd09a838a354e959e8ab8cacbbac771f2aa383b" Oct 01 14:03:49 crc kubenswrapper[4851]: E1001 14:03:49.600254 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2baef2a23eb9a3ebebecd9ee3cd09a838a354e959e8ab8cacbbac771f2aa383b\": container with ID starting with 2baef2a23eb9a3ebebecd9ee3cd09a838a354e959e8ab8cacbbac771f2aa383b not found: ID does not exist" containerID="2baef2a23eb9a3ebebecd9ee3cd09a838a354e959e8ab8cacbbac771f2aa383b" Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.600286 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2baef2a23eb9a3ebebecd9ee3cd09a838a354e959e8ab8cacbbac771f2aa383b"} err="failed to get container status \"2baef2a23eb9a3ebebecd9ee3cd09a838a354e959e8ab8cacbbac771f2aa383b\": rpc error: code = NotFound desc = could not find container \"2baef2a23eb9a3ebebecd9ee3cd09a838a354e959e8ab8cacbbac771f2aa383b\": container with ID starting with 2baef2a23eb9a3ebebecd9ee3cd09a838a354e959e8ab8cacbbac771f2aa383b not found: ID does not exist" Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.600315 4851 scope.go:117] "RemoveContainer" containerID="aa5d1d45ca485c3afc8cd02fbc6b2a075be606a3231a13ea868cdd78cacf98f1" Oct 01 14:03:49 crc kubenswrapper[4851]: E1001 14:03:49.600669 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa5d1d45ca485c3afc8cd02fbc6b2a075be606a3231a13ea868cdd78cacf98f1\": container with ID starting with aa5d1d45ca485c3afc8cd02fbc6b2a075be606a3231a13ea868cdd78cacf98f1 not found: ID does not exist" containerID="aa5d1d45ca485c3afc8cd02fbc6b2a075be606a3231a13ea868cdd78cacf98f1" Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.600707 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa5d1d45ca485c3afc8cd02fbc6b2a075be606a3231a13ea868cdd78cacf98f1"} err="failed to get container status \"aa5d1d45ca485c3afc8cd02fbc6b2a075be606a3231a13ea868cdd78cacf98f1\": rpc error: code = NotFound desc = could not find container \"aa5d1d45ca485c3afc8cd02fbc6b2a075be606a3231a13ea868cdd78cacf98f1\": container with ID starting with aa5d1d45ca485c3afc8cd02fbc6b2a075be606a3231a13ea868cdd78cacf98f1 not found: ID does not exist" Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.600734 4851 scope.go:117] "RemoveContainer" containerID="c89be710b7c14e5c1053687e4ef21119f01b8a0dceacf377ea5d79b3af46bc1a" Oct 01 14:03:49 crc kubenswrapper[4851]: E1001 14:03:49.600987 4851 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c89be710b7c14e5c1053687e4ef21119f01b8a0dceacf377ea5d79b3af46bc1a\": container with ID starting with c89be710b7c14e5c1053687e4ef21119f01b8a0dceacf377ea5d79b3af46bc1a not found: ID does not exist" containerID="c89be710b7c14e5c1053687e4ef21119f01b8a0dceacf377ea5d79b3af46bc1a" Oct 01 14:03:49 crc kubenswrapper[4851]: I1001 14:03:49.601017 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c89be710b7c14e5c1053687e4ef21119f01b8a0dceacf377ea5d79b3af46bc1a"} err="failed to get container status \"c89be710b7c14e5c1053687e4ef21119f01b8a0dceacf377ea5d79b3af46bc1a\": rpc error: code = NotFound desc = could not find container \"c89be710b7c14e5c1053687e4ef21119f01b8a0dceacf377ea5d79b3af46bc1a\": container with ID starting with c89be710b7c14e5c1053687e4ef21119f01b8a0dceacf377ea5d79b3af46bc1a not found: ID does not exist" Oct 01 14:03:50 crc kubenswrapper[4851]: I1001 14:03:50.348413 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f6bd6a7-f815-49a8-870f-26d58e4c7084" path="/var/lib/kubelet/pods/9f6bd6a7-f815-49a8-870f-26d58e4c7084/volumes" Oct 01 14:05:00 crc kubenswrapper[4851]: I1001 14:05:00.053254 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:05:00 crc kubenswrapper[4851]: I1001 14:05:00.053997 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:05:30 crc kubenswrapper[4851]: I1001 14:05:30.050620 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:05:30 crc kubenswrapper[4851]: I1001 14:05:30.051166 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:06:00 crc kubenswrapper[4851]: I1001 14:06:00.050613 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:06:00 crc kubenswrapper[4851]: I1001 14:06:00.051185 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:06:00 crc kubenswrapper[4851]: I1001 14:06:00.051232 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 14:06:00 crc kubenswrapper[4851]: I1001 14:06:00.051999 4851 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290"} pod="openshift-machine-config-operator/machine-config-daemon-fv72m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 14:06:00 crc kubenswrapper[4851]: I1001 14:06:00.052045 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" containerID="cri-o://972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" gracePeriod=600 Oct 01 14:06:00 crc kubenswrapper[4851]: E1001 14:06:00.172465 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:06:00 crc kubenswrapper[4851]: I1001 14:06:00.968184 4851 generic.go:334] "Generic (PLEG): container finished" podID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" exitCode=0 Oct 01 14:06:00 crc kubenswrapper[4851]: I1001 14:06:00.968232 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerDied","Data":"972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290"} Oct 01 14:06:00 crc kubenswrapper[4851]: I1001 14:06:00.968270 4851 scope.go:117] "RemoveContainer" containerID="8fe8ded7a2296c0a9a6846baf026cdf6a0bd0acd4bb0b8ed66d9e9879aebcdb9" Oct 01 14:06:00 crc kubenswrapper[4851]: I1001 14:06:00.969030 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:06:00 crc kubenswrapper[4851]: E1001 14:06:00.969393 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:06:14 crc kubenswrapper[4851]: I1001 14:06:14.328421 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:06:14 crc kubenswrapper[4851]: E1001 14:06:14.329086 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:06:29 crc 
kubenswrapper[4851]: I1001 14:06:29.329124 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:06:29 crc kubenswrapper[4851]: E1001 14:06:29.330588 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:06:41 crc kubenswrapper[4851]: I1001 14:06:41.329188 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:06:41 crc kubenswrapper[4851]: E1001 14:06:41.330360 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:06:52 crc kubenswrapper[4851]: I1001 14:06:52.335676 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:06:52 crc kubenswrapper[4851]: E1001 14:06:52.336617 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:07:03 crc kubenswrapper[4851]: E1001 14:07:03.401065 4851 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.251:38622->38.102.83.251:42733: write tcp 38.102.83.251:38622->38.102.83.251:42733: write: broken pipe Oct 01 14:07:07 crc kubenswrapper[4851]: I1001 14:07:07.329716 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:07:07 crc kubenswrapper[4851]: E1001 14:07:07.330728 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:07:14 crc kubenswrapper[4851]: E1001 14:07:14.095437 4851 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.251:32782->38.102.83.251:42733: write tcp 38.102.83.251:32782->38.102.83.251:42733: write: broken pipe Oct 01 14:07:18 crc kubenswrapper[4851]: E1001 14:07:18.811906 4851 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.251:32886->38.102.83.251:42733: write tcp 38.102.83.251:32886->38.102.83.251:42733: write: connection reset by peer Oct 01 14:07:22 crc kubenswrapper[4851]: I1001 14:07:22.334888 4851 scope.go:117] 
"RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:07:22 crc kubenswrapper[4851]: E1001 14:07:22.337298 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:07:25 crc kubenswrapper[4851]: I1001 14:07:25.946152 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hbcxf"] Oct 01 14:07:25 crc kubenswrapper[4851]: E1001 14:07:25.947302 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f6bd6a7-f815-49a8-870f-26d58e4c7084" containerName="registry-server" Oct 01 14:07:25 crc kubenswrapper[4851]: I1001 14:07:25.947321 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6bd6a7-f815-49a8-870f-26d58e4c7084" containerName="registry-server" Oct 01 14:07:25 crc kubenswrapper[4851]: E1001 14:07:25.947348 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f6bd6a7-f815-49a8-870f-26d58e4c7084" containerName="extract-content" Oct 01 14:07:25 crc kubenswrapper[4851]: I1001 14:07:25.947356 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6bd6a7-f815-49a8-870f-26d58e4c7084" containerName="extract-content" Oct 01 14:07:25 crc kubenswrapper[4851]: E1001 14:07:25.947399 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f6bd6a7-f815-49a8-870f-26d58e4c7084" containerName="extract-utilities" Oct 01 14:07:25 crc kubenswrapper[4851]: I1001 14:07:25.947408 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6bd6a7-f815-49a8-870f-26d58e4c7084" containerName="extract-utilities" Oct 01 14:07:25 crc kubenswrapper[4851]: I1001 14:07:25.947675 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f6bd6a7-f815-49a8-870f-26d58e4c7084" containerName="registry-server" Oct 01 14:07:25 crc kubenswrapper[4851]: I1001 14:07:25.949616 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hbcxf" Oct 01 14:07:25 crc kubenswrapper[4851]: I1001 14:07:25.956838 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hbcxf"] Oct 01 14:07:25 crc kubenswrapper[4851]: I1001 14:07:25.989408 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dzrh\" (UniqueName: \"kubernetes.io/projected/073368fc-7d41-4798-8c10-3df69a6f2b6a-kube-api-access-2dzrh\") pod \"community-operators-hbcxf\" (UID: \"073368fc-7d41-4798-8c10-3df69a6f2b6a\") " pod="openshift-marketplace/community-operators-hbcxf" Oct 01 14:07:25 crc kubenswrapper[4851]: I1001 14:07:25.989519 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/073368fc-7d41-4798-8c10-3df69a6f2b6a-catalog-content\") pod \"community-operators-hbcxf\" (UID: \"073368fc-7d41-4798-8c10-3df69a6f2b6a\") " pod="openshift-marketplace/community-operators-hbcxf" Oct 01 14:07:25 crc kubenswrapper[4851]: I1001 14:07:25.989654 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/073368fc-7d41-4798-8c10-3df69a6f2b6a-utilities\") pod \"community-operators-hbcxf\" (UID: \"073368fc-7d41-4798-8c10-3df69a6f2b6a\") " pod="openshift-marketplace/community-operators-hbcxf" Oct 01 14:07:26 crc kubenswrapper[4851]: I1001 14:07:26.091693 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/073368fc-7d41-4798-8c10-3df69a6f2b6a-catalog-content\") pod \"community-operators-hbcxf\" (UID: \"073368fc-7d41-4798-8c10-3df69a6f2b6a\") " pod="openshift-marketplace/community-operators-hbcxf" Oct 01 14:07:26 crc kubenswrapper[4851]: I1001 14:07:26.091875 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/073368fc-7d41-4798-8c10-3df69a6f2b6a-utilities\") pod \"community-operators-hbcxf\" (UID: \"073368fc-7d41-4798-8c10-3df69a6f2b6a\") " pod="openshift-marketplace/community-operators-hbcxf" Oct 01 14:07:26 crc kubenswrapper[4851]: I1001 14:07:26.091958 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dzrh\" (UniqueName: \"kubernetes.io/projected/073368fc-7d41-4798-8c10-3df69a6f2b6a-kube-api-access-2dzrh\") pod \"community-operators-hbcxf\" (UID: \"073368fc-7d41-4798-8c10-3df69a6f2b6a\") " pod="openshift-marketplace/community-operators-hbcxf" Oct 01 14:07:26 crc kubenswrapper[4851]: I1001 14:07:26.092248 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/073368fc-7d41-4798-8c10-3df69a6f2b6a-catalog-content\") pod \"community-operators-hbcxf\" (UID: \"073368fc-7d41-4798-8c10-3df69a6f2b6a\") " pod="openshift-marketplace/community-operators-hbcxf" Oct 01 14:07:26 crc kubenswrapper[4851]: I1001 14:07:26.092565 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/073368fc-7d41-4798-8c10-3df69a6f2b6a-utilities\") pod \"community-operators-hbcxf\" (UID: \"073368fc-7d41-4798-8c10-3df69a6f2b6a\") " pod="openshift-marketplace/community-operators-hbcxf" Oct 01 14:07:26 crc kubenswrapper[4851]: I1001 14:07:26.158786 4851 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2dzrh\" (UniqueName: \"kubernetes.io/projected/073368fc-7d41-4798-8c10-3df69a6f2b6a-kube-api-access-2dzrh\") pod \"community-operators-hbcxf\" (UID: \"073368fc-7d41-4798-8c10-3df69a6f2b6a\") " pod="openshift-marketplace/community-operators-hbcxf" Oct 01 14:07:26 crc kubenswrapper[4851]: I1001 14:07:26.288685 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hbcxf" Oct 01 14:07:26 crc kubenswrapper[4851]: I1001 14:07:26.839522 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hbcxf"] Oct 01 14:07:26 crc kubenswrapper[4851]: I1001 14:07:26.932884 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbcxf" event={"ID":"073368fc-7d41-4798-8c10-3df69a6f2b6a","Type":"ContainerStarted","Data":"0c3724688b23fb384a51c44051aa233de5038a1aa75ffa197b6624e9a305c2db"} Oct 01 14:07:27 crc kubenswrapper[4851]: I1001 14:07:27.944034 4851 generic.go:334] "Generic (PLEG): container finished" podID="073368fc-7d41-4798-8c10-3df69a6f2b6a" containerID="29599150255f0740df4f3e802c0d6321ef71b45d66db25ebf8fecbab1ba083e4" exitCode=0 Oct 01 14:07:27 crc kubenswrapper[4851]: I1001 14:07:27.944087 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbcxf" event={"ID":"073368fc-7d41-4798-8c10-3df69a6f2b6a","Type":"ContainerDied","Data":"29599150255f0740df4f3e802c0d6321ef71b45d66db25ebf8fecbab1ba083e4"} Oct 01 14:07:29 crc kubenswrapper[4851]: I1001 14:07:29.973465 4851 generic.go:334] "Generic (PLEG): container finished" podID="073368fc-7d41-4798-8c10-3df69a6f2b6a" containerID="ffe0d214b0ea8d423640af19677be23eed2c6a3bd94def3b9ad3472c31a8fe17" exitCode=0 Oct 01 14:07:29 crc kubenswrapper[4851]: I1001 14:07:29.973593 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbcxf" event={"ID":"073368fc-7d41-4798-8c10-3df69a6f2b6a","Type":"ContainerDied","Data":"ffe0d214b0ea8d423640af19677be23eed2c6a3bd94def3b9ad3472c31a8fe17"} Oct 01 14:07:30 crc kubenswrapper[4851]: I1001 14:07:30.987113 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbcxf" event={"ID":"073368fc-7d41-4798-8c10-3df69a6f2b6a","Type":"ContainerStarted","Data":"95c86b5c4f4a86bd48a7a989941e7a970c550c32fd56140a7e620ae8c5030903"} Oct 01 14:07:31 crc kubenswrapper[4851]: I1001 14:07:31.022014 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hbcxf" podStartSLOduration=3.385710322 podStartE2EDuration="6.021979248s" podCreationTimestamp="2025-10-01 14:07:25 +0000 UTC" firstStartedPulling="2025-10-01 14:07:27.946799924 +0000 UTC m=+4456.291917430" lastFinishedPulling="2025-10-01 14:07:30.58306883 +0000 UTC m=+4458.928186356" observedRunningTime="2025-10-01 14:07:31.007142145 +0000 UTC m=+4459.352259631" watchObservedRunningTime="2025-10-01 14:07:31.021979248 +0000 UTC m=+4459.367096764" Oct 01 14:07:36 crc kubenswrapper[4851]: I1001 14:07:36.289235 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hbcxf" Oct 01 14:07:36 crc kubenswrapper[4851]: I1001 14:07:36.289837 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hbcxf" Oct 01 14:07:36 crc kubenswrapper[4851]: I1001 14:07:36.474709 4851 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hbcxf" Oct 01 14:07:37 crc kubenswrapper[4851]: I1001 14:07:37.144346 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hbcxf" Oct 01 14:07:37 crc kubenswrapper[4851]: I1001 14:07:37.229026 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hbcxf"] Oct 01 14:07:37 crc kubenswrapper[4851]: I1001 14:07:37.328423 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:07:37 crc kubenswrapper[4851]: E1001 14:07:37.328887 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:07:39 crc kubenswrapper[4851]: I1001 14:07:39.079208 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hbcxf" podUID="073368fc-7d41-4798-8c10-3df69a6f2b6a" containerName="registry-server" containerID="cri-o://95c86b5c4f4a86bd48a7a989941e7a970c550c32fd56140a7e620ae8c5030903" gracePeriod=2 Oct 01 14:07:39 crc kubenswrapper[4851]: I1001 14:07:39.628428 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hbcxf" Oct 01 14:07:39 crc kubenswrapper[4851]: I1001 14:07:39.785278 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dzrh\" (UniqueName: \"kubernetes.io/projected/073368fc-7d41-4798-8c10-3df69a6f2b6a-kube-api-access-2dzrh\") pod \"073368fc-7d41-4798-8c10-3df69a6f2b6a\" (UID: \"073368fc-7d41-4798-8c10-3df69a6f2b6a\") " Oct 01 14:07:39 crc kubenswrapper[4851]: I1001 14:07:39.785588 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/073368fc-7d41-4798-8c10-3df69a6f2b6a-utilities\") pod \"073368fc-7d41-4798-8c10-3df69a6f2b6a\" (UID: \"073368fc-7d41-4798-8c10-3df69a6f2b6a\") " Oct 01 14:07:39 crc kubenswrapper[4851]: I1001 14:07:39.785623 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/073368fc-7d41-4798-8c10-3df69a6f2b6a-catalog-content\") pod \"073368fc-7d41-4798-8c10-3df69a6f2b6a\" (UID: \"073368fc-7d41-4798-8c10-3df69a6f2b6a\") " Oct 01 14:07:39 crc kubenswrapper[4851]: I1001 14:07:39.787176 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/073368fc-7d41-4798-8c10-3df69a6f2b6a-utilities" (OuterVolumeSpecName: "utilities") pod "073368fc-7d41-4798-8c10-3df69a6f2b6a" (UID: "073368fc-7d41-4798-8c10-3df69a6f2b6a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:07:39 crc kubenswrapper[4851]: I1001 14:07:39.796739 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/073368fc-7d41-4798-8c10-3df69a6f2b6a-kube-api-access-2dzrh" (OuterVolumeSpecName: "kube-api-access-2dzrh") pod "073368fc-7d41-4798-8c10-3df69a6f2b6a" (UID: "073368fc-7d41-4798-8c10-3df69a6f2b6a"). InnerVolumeSpecName "kube-api-access-2dzrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:07:39 crc kubenswrapper[4851]: I1001 14:07:39.887906 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dzrh\" (UniqueName: \"kubernetes.io/projected/073368fc-7d41-4798-8c10-3df69a6f2b6a-kube-api-access-2dzrh\") on node \"crc\" DevicePath \"\"" Oct 01 14:07:39 crc kubenswrapper[4851]: I1001 14:07:39.887944 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/073368fc-7d41-4798-8c10-3df69a6f2b6a-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:07:40 crc kubenswrapper[4851]: I1001 14:07:40.093583 4851 generic.go:334] "Generic (PLEG): container finished" podID="073368fc-7d41-4798-8c10-3df69a6f2b6a" containerID="95c86b5c4f4a86bd48a7a989941e7a970c550c32fd56140a7e620ae8c5030903" exitCode=0 Oct 01 14:07:40 crc kubenswrapper[4851]: I1001 14:07:40.093648 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbcxf" event={"ID":"073368fc-7d41-4798-8c10-3df69a6f2b6a","Type":"ContainerDied","Data":"95c86b5c4f4a86bd48a7a989941e7a970c550c32fd56140a7e620ae8c5030903"} Oct 01 14:07:40 crc kubenswrapper[4851]: I1001 14:07:40.093677 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hbcxf" Oct 01 14:07:40 crc kubenswrapper[4851]: I1001 14:07:40.093726 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbcxf" event={"ID":"073368fc-7d41-4798-8c10-3df69a6f2b6a","Type":"ContainerDied","Data":"0c3724688b23fb384a51c44051aa233de5038a1aa75ffa197b6624e9a305c2db"} Oct 01 14:07:40 crc kubenswrapper[4851]: I1001 14:07:40.093759 4851 scope.go:117] "RemoveContainer" containerID="95c86b5c4f4a86bd48a7a989941e7a970c550c32fd56140a7e620ae8c5030903" Oct 01 14:07:40 crc kubenswrapper[4851]: I1001 14:07:40.120529 4851 scope.go:117] "RemoveContainer" containerID="ffe0d214b0ea8d423640af19677be23eed2c6a3bd94def3b9ad3472c31a8fe17" Oct 01 14:07:40 crc kubenswrapper[4851]: I1001 14:07:40.162583 4851 scope.go:117] "RemoveContainer" containerID="29599150255f0740df4f3e802c0d6321ef71b45d66db25ebf8fecbab1ba083e4" Oct 01 14:07:40 crc kubenswrapper[4851]: I1001 14:07:40.221337 4851 scope.go:117] "RemoveContainer" containerID="95c86b5c4f4a86bd48a7a989941e7a970c550c32fd56140a7e620ae8c5030903" Oct 01 14:07:40 crc kubenswrapper[4851]: E1001 14:07:40.222092 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95c86b5c4f4a86bd48a7a989941e7a970c550c32fd56140a7e620ae8c5030903\": container with ID starting with 95c86b5c4f4a86bd48a7a989941e7a970c550c32fd56140a7e620ae8c5030903 not found: ID does not exist" containerID="95c86b5c4f4a86bd48a7a989941e7a970c550c32fd56140a7e620ae8c5030903" Oct 01 14:07:40 crc kubenswrapper[4851]: I1001 14:07:40.222131 4851 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"95c86b5c4f4a86bd48a7a989941e7a970c550c32fd56140a7e620ae8c5030903"} err="failed to get container status \"95c86b5c4f4a86bd48a7a989941e7a970c550c32fd56140a7e620ae8c5030903\": rpc error: code = NotFound desc = could not find container \"95c86b5c4f4a86bd48a7a989941e7a970c550c32fd56140a7e620ae8c5030903\": container with ID starting with 95c86b5c4f4a86bd48a7a989941e7a970c550c32fd56140a7e620ae8c5030903 not found: ID does not exist" Oct 01 14:07:40 crc kubenswrapper[4851]: I1001 14:07:40.222158 4851 scope.go:117] "RemoveContainer" containerID="ffe0d214b0ea8d423640af19677be23eed2c6a3bd94def3b9ad3472c31a8fe17" Oct 01 14:07:40 crc kubenswrapper[4851]: E1001 14:07:40.222381 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffe0d214b0ea8d423640af19677be23eed2c6a3bd94def3b9ad3472c31a8fe17\": container with ID starting with ffe0d214b0ea8d423640af19677be23eed2c6a3bd94def3b9ad3472c31a8fe17 not found: ID does not exist" containerID="ffe0d214b0ea8d423640af19677be23eed2c6a3bd94def3b9ad3472c31a8fe17" Oct 01 14:07:40 crc kubenswrapper[4851]: I1001 14:07:40.222407 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffe0d214b0ea8d423640af19677be23eed2c6a3bd94def3b9ad3472c31a8fe17"} err="failed to get container status \"ffe0d214b0ea8d423640af19677be23eed2c6a3bd94def3b9ad3472c31a8fe17\": rpc error: code = NotFound desc = could not find container \"ffe0d214b0ea8d423640af19677be23eed2c6a3bd94def3b9ad3472c31a8fe17\": container with ID starting with ffe0d214b0ea8d423640af19677be23eed2c6a3bd94def3b9ad3472c31a8fe17 not found: ID does not exist" Oct 01 14:07:40 crc kubenswrapper[4851]: I1001 14:07:40.222426 4851 scope.go:117] "RemoveContainer" containerID="29599150255f0740df4f3e802c0d6321ef71b45d66db25ebf8fecbab1ba083e4" Oct 01 14:07:40 crc kubenswrapper[4851]: E1001 14:07:40.227935 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29599150255f0740df4f3e802c0d6321ef71b45d66db25ebf8fecbab1ba083e4\": container with ID starting with 29599150255f0740df4f3e802c0d6321ef71b45d66db25ebf8fecbab1ba083e4 not found: ID does not exist" containerID="29599150255f0740df4f3e802c0d6321ef71b45d66db25ebf8fecbab1ba083e4" Oct 01 14:07:40 crc kubenswrapper[4851]: I1001 14:07:40.227968 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29599150255f0740df4f3e802c0d6321ef71b45d66db25ebf8fecbab1ba083e4"} err="failed to get container status \"29599150255f0740df4f3e802c0d6321ef71b45d66db25ebf8fecbab1ba083e4\": rpc error: code = NotFound desc = could not find container \"29599150255f0740df4f3e802c0d6321ef71b45d66db25ebf8fecbab1ba083e4\": container with ID starting with 29599150255f0740df4f3e802c0d6321ef71b45d66db25ebf8fecbab1ba083e4 not found: ID does not exist" Oct 01 14:07:40 crc kubenswrapper[4851]: I1001 14:07:40.229571 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/073368fc-7d41-4798-8c10-3df69a6f2b6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "073368fc-7d41-4798-8c10-3df69a6f2b6a" (UID: "073368fc-7d41-4798-8c10-3df69a6f2b6a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:07:40 crc kubenswrapper[4851]: I1001 14:07:40.299561 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/073368fc-7d41-4798-8c10-3df69a6f2b6a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:07:40 crc kubenswrapper[4851]: I1001 14:07:40.436557 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hbcxf"] Oct 01 14:07:40 crc kubenswrapper[4851]: I1001 14:07:40.444603 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hbcxf"] Oct 01 14:07:42 crc kubenswrapper[4851]: I1001 14:07:42.353264 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="073368fc-7d41-4798-8c10-3df69a6f2b6a" path="/var/lib/kubelet/pods/073368fc-7d41-4798-8c10-3df69a6f2b6a/volumes" Oct 01 14:07:50 crc kubenswrapper[4851]: I1001 14:07:50.328868 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:07:50 crc kubenswrapper[4851]: E1001 14:07:50.329767 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:08:02 crc kubenswrapper[4851]: I1001 14:08:02.344581 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:08:02 crc kubenswrapper[4851]: E1001 14:08:02.346781 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:08:14 crc kubenswrapper[4851]: I1001 14:08:14.328872 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:08:14 crc kubenswrapper[4851]: E1001 14:08:14.329888 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:08:29 crc kubenswrapper[4851]: I1001 14:08:29.328712 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:08:29 crc kubenswrapper[4851]: E1001 14:08:29.329550 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:08:41 crc kubenswrapper[4851]: I1001 14:08:41.329283 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:08:41 crc kubenswrapper[4851]: E1001 14:08:41.330434 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:08:56 crc kubenswrapper[4851]: I1001 14:08:56.328133 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:08:56 crc kubenswrapper[4851]: E1001 14:08:56.328907 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:09:09 crc kubenswrapper[4851]: I1001 14:09:09.328982 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:09:09 crc kubenswrapper[4851]: E1001 14:09:09.330088 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:09:24 crc kubenswrapper[4851]: I1001 14:09:24.328748 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:09:24 crc kubenswrapper[4851]: E1001 14:09:24.330111 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:09:38 crc kubenswrapper[4851]: I1001 14:09:38.328784 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:09:38 crc kubenswrapper[4851]: E1001 14:09:38.330814 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:09:49 crc kubenswrapper[4851]: I1001 14:09:49.328800 4851 
scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:09:49 crc kubenswrapper[4851]: E1001 14:09:49.329808 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:10:03 crc kubenswrapper[4851]: I1001 14:10:03.328013 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:10:03 crc kubenswrapper[4851]: E1001 14:10:03.328997 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:10:18 crc kubenswrapper[4851]: I1001 14:10:18.328538 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:10:18 crc kubenswrapper[4851]: E1001 14:10:18.329473 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:10:30 crc kubenswrapper[4851]: I1001 14:10:30.328744 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:10:30 crc kubenswrapper[4851]: E1001 14:10:30.329872 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:10:43 crc kubenswrapper[4851]: I1001 14:10:43.329245 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:10:43 crc kubenswrapper[4851]: E1001 14:10:43.330159 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:10:57 crc kubenswrapper[4851]: I1001 14:10:57.329076 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:10:57 crc kubenswrapper[4851]: E1001 14:10:57.330384 4851 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:11:10 crc kubenswrapper[4851]: I1001 14:11:10.329254 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290" Oct 01 14:11:10 crc kubenswrapper[4851]: I1001 14:11:10.652697 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"3bedab356653f0132308f759754cf021851ccb18415c5aa7a1c2d58cad13e034"} Oct 01 14:12:17 crc kubenswrapper[4851]: I1001 14:12:17.993011 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5sg2z"] Oct 01 14:12:17 crc kubenswrapper[4851]: E1001 14:12:17.994226 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="073368fc-7d41-4798-8c10-3df69a6f2b6a" containerName="registry-server" Oct 01 14:12:17 crc kubenswrapper[4851]: I1001 14:12:17.994248 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="073368fc-7d41-4798-8c10-3df69a6f2b6a" containerName="registry-server" Oct 01 14:12:17 crc kubenswrapper[4851]: E1001 14:12:17.994308 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="073368fc-7d41-4798-8c10-3df69a6f2b6a" containerName="extract-utilities" Oct 01 14:12:17 crc kubenswrapper[4851]: I1001 14:12:17.994318 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="073368fc-7d41-4798-8c10-3df69a6f2b6a" containerName="extract-utilities" Oct 01 14:12:17 crc kubenswrapper[4851]: E1001 14:12:17.994351 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="073368fc-7d41-4798-8c10-3df69a6f2b6a" containerName="extract-content" Oct 01 14:12:17 crc kubenswrapper[4851]: I1001 14:12:17.994360 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="073368fc-7d41-4798-8c10-3df69a6f2b6a" containerName="extract-content" Oct 01 14:12:17 crc kubenswrapper[4851]: I1001 14:12:17.994684 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="073368fc-7d41-4798-8c10-3df69a6f2b6a" containerName="registry-server" Oct 01 14:12:17 crc kubenswrapper[4851]: I1001 14:12:17.996597 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5sg2z" Oct 01 14:12:18 crc kubenswrapper[4851]: I1001 14:12:18.011306 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5sg2z"] Oct 01 14:12:18 crc kubenswrapper[4851]: I1001 14:12:18.118180 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7grf9\" (UniqueName: \"kubernetes.io/projected/3a19425f-e0da-4fd3-9d48-d811f1199303-kube-api-access-7grf9\") pod \"redhat-operators-5sg2z\" (UID: \"3a19425f-e0da-4fd3-9d48-d811f1199303\") " pod="openshift-marketplace/redhat-operators-5sg2z" Oct 01 14:12:18 crc kubenswrapper[4851]: I1001 14:12:18.118613 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a19425f-e0da-4fd3-9d48-d811f1199303-catalog-content\") pod \"redhat-operators-5sg2z\" (UID: \"3a19425f-e0da-4fd3-9d48-d811f1199303\") " pod="openshift-marketplace/redhat-operators-5sg2z" Oct 01 14:12:18 crc kubenswrapper[4851]: I1001 14:12:18.118666 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a19425f-e0da-4fd3-9d48-d811f1199303-utilities\") pod \"redhat-operators-5sg2z\" (UID: \"3a19425f-e0da-4fd3-9d48-d811f1199303\") " pod="openshift-marketplace/redhat-operators-5sg2z" Oct 01 14:12:18 crc kubenswrapper[4851]: I1001 14:12:18.220600 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7grf9\" (UniqueName: \"kubernetes.io/projected/3a19425f-e0da-4fd3-9d48-d811f1199303-kube-api-access-7grf9\") pod \"redhat-operators-5sg2z\" (UID: \"3a19425f-e0da-4fd3-9d48-d811f1199303\") " pod="openshift-marketplace/redhat-operators-5sg2z" Oct 01 14:12:18 crc kubenswrapper[4851]: I1001 14:12:18.220676 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a19425f-e0da-4fd3-9d48-d811f1199303-catalog-content\") pod \"redhat-operators-5sg2z\" (UID: \"3a19425f-e0da-4fd3-9d48-d811f1199303\") " pod="openshift-marketplace/redhat-operators-5sg2z" Oct 01 14:12:18 crc kubenswrapper[4851]: I1001 14:12:18.220739 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a19425f-e0da-4fd3-9d48-d811f1199303-utilities\") pod \"redhat-operators-5sg2z\" (UID: \"3a19425f-e0da-4fd3-9d48-d811f1199303\") " pod="openshift-marketplace/redhat-operators-5sg2z" Oct 01 14:12:18 crc kubenswrapper[4851]: I1001 14:12:18.221365 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a19425f-e0da-4fd3-9d48-d811f1199303-catalog-content\") pod \"redhat-operators-5sg2z\" (UID: \"3a19425f-e0da-4fd3-9d48-d811f1199303\") " pod="openshift-marketplace/redhat-operators-5sg2z" Oct 01 14:12:18 crc kubenswrapper[4851]: I1001 14:12:18.221394 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a19425f-e0da-4fd3-9d48-d811f1199303-utilities\") pod \"redhat-operators-5sg2z\" (UID: \"3a19425f-e0da-4fd3-9d48-d811f1199303\") " pod="openshift-marketplace/redhat-operators-5sg2z" Oct 01 14:12:18 crc kubenswrapper[4851]: I1001 14:12:18.793022 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7grf9\" (UniqueName: \"kubernetes.io/projected/3a19425f-e0da-4fd3-9d48-d811f1199303-kube-api-access-7grf9\") pod \"redhat-operators-5sg2z\" (UID: \"3a19425f-e0da-4fd3-9d48-d811f1199303\") " pod="openshift-marketplace/redhat-operators-5sg2z" Oct 01 14:12:18 crc kubenswrapper[4851]: I1001 14:12:18.924129 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5sg2z" Oct 01 14:12:19 crc kubenswrapper[4851]: I1001 14:12:19.487537 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5sg2z"] Oct 01 14:12:20 crc kubenswrapper[4851]: I1001 14:12:20.472947 4851 generic.go:334] "Generic (PLEG): container finished" podID="3a19425f-e0da-4fd3-9d48-d811f1199303" containerID="83169421e7924ccf873b12c86a2d165003ef730c7f5967fcf165bbe50a05f783" exitCode=0 Oct 01 14:12:20 crc kubenswrapper[4851]: I1001 14:12:20.473069 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5sg2z" event={"ID":"3a19425f-e0da-4fd3-9d48-d811f1199303","Type":"ContainerDied","Data":"83169421e7924ccf873b12c86a2d165003ef730c7f5967fcf165bbe50a05f783"} Oct 01 14:12:20 crc kubenswrapper[4851]: I1001 14:12:20.473242 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5sg2z" event={"ID":"3a19425f-e0da-4fd3-9d48-d811f1199303","Type":"ContainerStarted","Data":"341a59276c5e39b2e0d9be5d294a95dd47debe436169b4d95c88e011d2157c80"} Oct 01 14:12:20 crc kubenswrapper[4851]: I1001 14:12:20.475937 4851 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 14:12:22 crc kubenswrapper[4851]: I1001 14:12:22.501814 4851 generic.go:334] "Generic (PLEG): container finished" podID="3a19425f-e0da-4fd3-9d48-d811f1199303" containerID="c19754fd0ca13d7356b97d37062277be84e46c8e55806f124a31d7ded4be4c43" exitCode=0 Oct 01 14:12:22 crc kubenswrapper[4851]: I1001 14:12:22.501872 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5sg2z" event={"ID":"3a19425f-e0da-4fd3-9d48-d811f1199303","Type":"ContainerDied","Data":"c19754fd0ca13d7356b97d37062277be84e46c8e55806f124a31d7ded4be4c43"} Oct 01 14:12:25 crc kubenswrapper[4851]: I1001 14:12:25.558040 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5sg2z" event={"ID":"3a19425f-e0da-4fd3-9d48-d811f1199303","Type":"ContainerStarted","Data":"9706745c56ba1977ac19383b55969287b3ee416b30f95b7a8d447aeecff737bc"} Oct 01 14:12:28 crc kubenswrapper[4851]: I1001 14:12:28.925139 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5sg2z" Oct 01 14:12:28 crc kubenswrapper[4851]: I1001 14:12:28.925869 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5sg2z" Oct 01 14:12:30 crc kubenswrapper[4851]: I1001 14:12:30.019377 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5sg2z" podUID="3a19425f-e0da-4fd3-9d48-d811f1199303" containerName="registry-server" probeResult="failure" output=< Oct 01 14:12:30 crc kubenswrapper[4851]: timeout: failed to connect service ":50051" within 1s Oct 01 14:12:30 crc kubenswrapper[4851]: > Oct 01 14:12:39 crc kubenswrapper[4851]: I1001 14:12:39.016682 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5sg2z" Oct 
01 14:12:39 crc kubenswrapper[4851]: I1001 14:12:39.066240 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5sg2z" podStartSLOduration=19.370638011 podStartE2EDuration="22.06621242s" podCreationTimestamp="2025-10-01 14:12:17 +0000 UTC" firstStartedPulling="2025-10-01 14:12:20.475526603 +0000 UTC m=+4748.820644129" lastFinishedPulling="2025-10-01 14:12:23.171101042 +0000 UTC m=+4751.516218538" observedRunningTime="2025-10-01 14:12:25.580414795 +0000 UTC m=+4753.925532291" watchObservedRunningTime="2025-10-01 14:12:39.06621242 +0000 UTC m=+4767.411329936" Oct 01 14:12:39 crc kubenswrapper[4851]: I1001 14:12:39.111698 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5sg2z" Oct 01 14:12:39 crc kubenswrapper[4851]: I1001 14:12:39.267857 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5sg2z"] Oct 01 14:12:40 crc kubenswrapper[4851]: I1001 14:12:40.743409 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5sg2z" podUID="3a19425f-e0da-4fd3-9d48-d811f1199303" containerName="registry-server" containerID="cri-o://9706745c56ba1977ac19383b55969287b3ee416b30f95b7a8d447aeecff737bc" gracePeriod=2 Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.227101 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5sg2z" Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.338427 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a19425f-e0da-4fd3-9d48-d811f1199303-utilities\") pod \"3a19425f-e0da-4fd3-9d48-d811f1199303\" (UID: \"3a19425f-e0da-4fd3-9d48-d811f1199303\") " Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.338601 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7grf9\" (UniqueName: \"kubernetes.io/projected/3a19425f-e0da-4fd3-9d48-d811f1199303-kube-api-access-7grf9\") pod \"3a19425f-e0da-4fd3-9d48-d811f1199303\" (UID: \"3a19425f-e0da-4fd3-9d48-d811f1199303\") " Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.338657 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a19425f-e0da-4fd3-9d48-d811f1199303-catalog-content\") pod \"3a19425f-e0da-4fd3-9d48-d811f1199303\" (UID: \"3a19425f-e0da-4fd3-9d48-d811f1199303\") " Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.339597 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a19425f-e0da-4fd3-9d48-d811f1199303-utilities" (OuterVolumeSpecName: "utilities") pod "3a19425f-e0da-4fd3-9d48-d811f1199303" (UID: "3a19425f-e0da-4fd3-9d48-d811f1199303"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.353947 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a19425f-e0da-4fd3-9d48-d811f1199303-kube-api-access-7grf9" (OuterVolumeSpecName: "kube-api-access-7grf9") pod "3a19425f-e0da-4fd3-9d48-d811f1199303" (UID: "3a19425f-e0da-4fd3-9d48-d811f1199303"). InnerVolumeSpecName "kube-api-access-7grf9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.442697 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a19425f-e0da-4fd3-9d48-d811f1199303-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.443023 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7grf9\" (UniqueName: \"kubernetes.io/projected/3a19425f-e0da-4fd3-9d48-d811f1199303-kube-api-access-7grf9\") on node \"crc\" DevicePath \"\"" Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.445204 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a19425f-e0da-4fd3-9d48-d811f1199303-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a19425f-e0da-4fd3-9d48-d811f1199303" (UID: "3a19425f-e0da-4fd3-9d48-d811f1199303"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.545344 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a19425f-e0da-4fd3-9d48-d811f1199303-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.762034 4851 generic.go:334] "Generic (PLEG): container finished" podID="3a19425f-e0da-4fd3-9d48-d811f1199303" containerID="9706745c56ba1977ac19383b55969287b3ee416b30f95b7a8d447aeecff737bc" exitCode=0 Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.762092 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5sg2z" event={"ID":"3a19425f-e0da-4fd3-9d48-d811f1199303","Type":"ContainerDied","Data":"9706745c56ba1977ac19383b55969287b3ee416b30f95b7a8d447aeecff737bc"} Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.762212 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5sg2z" event={"ID":"3a19425f-e0da-4fd3-9d48-d811f1199303","Type":"ContainerDied","Data":"341a59276c5e39b2e0d9be5d294a95dd47debe436169b4d95c88e011d2157c80"} Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.762262 4851 scope.go:117] "RemoveContainer" containerID="9706745c56ba1977ac19383b55969287b3ee416b30f95b7a8d447aeecff737bc" Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.762133 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5sg2z" Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.821451 4851 scope.go:117] "RemoveContainer" containerID="c19754fd0ca13d7356b97d37062277be84e46c8e55806f124a31d7ded4be4c43" Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.831545 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5sg2z"] Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.846746 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5sg2z"] Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.864042 4851 scope.go:117] "RemoveContainer" containerID="83169421e7924ccf873b12c86a2d165003ef730c7f5967fcf165bbe50a05f783" Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.924011 4851 scope.go:117] "RemoveContainer" containerID="9706745c56ba1977ac19383b55969287b3ee416b30f95b7a8d447aeecff737bc" Oct 01 14:12:41 crc kubenswrapper[4851]: E1001 14:12:41.924921 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9706745c56ba1977ac19383b55969287b3ee416b30f95b7a8d447aeecff737bc\": container with ID starting with 9706745c56ba1977ac19383b55969287b3ee416b30f95b7a8d447aeecff737bc not found: ID does not exist" containerID="9706745c56ba1977ac19383b55969287b3ee416b30f95b7a8d447aeecff737bc" Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.924977 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9706745c56ba1977ac19383b55969287b3ee416b30f95b7a8d447aeecff737bc"} err="failed to get container status \"9706745c56ba1977ac19383b55969287b3ee416b30f95b7a8d447aeecff737bc\": rpc error: code = NotFound desc = could not find container \"9706745c56ba1977ac19383b55969287b3ee416b30f95b7a8d447aeecff737bc\": container with ID starting with 9706745c56ba1977ac19383b55969287b3ee416b30f95b7a8d447aeecff737bc not found: ID does not exist" Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.925014 4851 scope.go:117] "RemoveContainer" containerID="c19754fd0ca13d7356b97d37062277be84e46c8e55806f124a31d7ded4be4c43" Oct 01 14:12:41 crc kubenswrapper[4851]: E1001 14:12:41.925405 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c19754fd0ca13d7356b97d37062277be84e46c8e55806f124a31d7ded4be4c43\": container with ID starting with c19754fd0ca13d7356b97d37062277be84e46c8e55806f124a31d7ded4be4c43 not found: ID does not exist" containerID="c19754fd0ca13d7356b97d37062277be84e46c8e55806f124a31d7ded4be4c43" Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.925596 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c19754fd0ca13d7356b97d37062277be84e46c8e55806f124a31d7ded4be4c43"} err="failed to get container status \"c19754fd0ca13d7356b97d37062277be84e46c8e55806f124a31d7ded4be4c43\": rpc error: code = NotFound desc = could not find container \"c19754fd0ca13d7356b97d37062277be84e46c8e55806f124a31d7ded4be4c43\": container with ID starting with c19754fd0ca13d7356b97d37062277be84e46c8e55806f124a31d7ded4be4c43 not found: ID does not exist" Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.925717 4851 scope.go:117] "RemoveContainer" containerID="83169421e7924ccf873b12c86a2d165003ef730c7f5967fcf165bbe50a05f783" Oct 01 14:12:41 crc kubenswrapper[4851]: E1001 14:12:41.926380 4851 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"83169421e7924ccf873b12c86a2d165003ef730c7f5967fcf165bbe50a05f783\": container with ID starting with 83169421e7924ccf873b12c86a2d165003ef730c7f5967fcf165bbe50a05f783 not found: ID does not exist" containerID="83169421e7924ccf873b12c86a2d165003ef730c7f5967fcf165bbe50a05f783" Oct 01 14:12:41 crc kubenswrapper[4851]: I1001 14:12:41.926543 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83169421e7924ccf873b12c86a2d165003ef730c7f5967fcf165bbe50a05f783"} err="failed to get container status \"83169421e7924ccf873b12c86a2d165003ef730c7f5967fcf165bbe50a05f783\": rpc error: code = NotFound desc = could not find container \"83169421e7924ccf873b12c86a2d165003ef730c7f5967fcf165bbe50a05f783\": container with ID starting with 83169421e7924ccf873b12c86a2d165003ef730c7f5967fcf165bbe50a05f783 not found: ID does not exist" Oct 01 14:12:42 crc kubenswrapper[4851]: I1001 14:12:42.356198 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a19425f-e0da-4fd3-9d48-d811f1199303" path="/var/lib/kubelet/pods/3a19425f-e0da-4fd3-9d48-d811f1199303/volumes" Oct 01 14:13:30 crc kubenswrapper[4851]: I1001 14:13:30.049984 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:13:30 crc kubenswrapper[4851]: I1001 14:13:30.050649 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:13:37 crc kubenswrapper[4851]: I1001 14:13:37.075450 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zcmb4"] Oct 01 14:13:37 crc kubenswrapper[4851]: E1001 14:13:37.077070 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a19425f-e0da-4fd3-9d48-d811f1199303" containerName="registry-server" Oct 01 14:13:37 crc kubenswrapper[4851]: I1001 14:13:37.077105 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a19425f-e0da-4fd3-9d48-d811f1199303" containerName="registry-server" Oct 01 14:13:37 crc kubenswrapper[4851]: E1001 14:13:37.077144 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a19425f-e0da-4fd3-9d48-d811f1199303" containerName="extract-content" Oct 01 14:13:37 crc kubenswrapper[4851]: I1001 14:13:37.077158 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a19425f-e0da-4fd3-9d48-d811f1199303" containerName="extract-content" Oct 01 14:13:37 crc kubenswrapper[4851]: E1001 14:13:37.077219 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a19425f-e0da-4fd3-9d48-d811f1199303" containerName="extract-utilities" Oct 01 14:13:37 crc kubenswrapper[4851]: I1001 14:13:37.077233 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a19425f-e0da-4fd3-9d48-d811f1199303" containerName="extract-utilities" Oct 01 14:13:37 crc kubenswrapper[4851]: I1001 14:13:37.077648 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a19425f-e0da-4fd3-9d48-d811f1199303" containerName="registry-server" Oct 01 14:13:37 crc kubenswrapper[4851]: I1001 
Oct 01 14:13:37 crc kubenswrapper[4851]: I1001 14:13:37.087648 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zcmb4"]
Oct 01 14:13:37 crc kubenswrapper[4851]: I1001 14:13:37.141669 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c9a0315-50c2-42fd-adcd-068dbd90296d-utilities\") pod \"certified-operators-zcmb4\" (UID: \"2c9a0315-50c2-42fd-adcd-068dbd90296d\") " pod="openshift-marketplace/certified-operators-zcmb4"
Oct 01 14:13:37 crc kubenswrapper[4851]: I1001 14:13:37.142137 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c9a0315-50c2-42fd-adcd-068dbd90296d-catalog-content\") pod \"certified-operators-zcmb4\" (UID: \"2c9a0315-50c2-42fd-adcd-068dbd90296d\") " pod="openshift-marketplace/certified-operators-zcmb4"
Oct 01 14:13:37 crc kubenswrapper[4851]: I1001 14:13:37.142198 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgljg\" (UniqueName: \"kubernetes.io/projected/2c9a0315-50c2-42fd-adcd-068dbd90296d-kube-api-access-xgljg\") pod \"certified-operators-zcmb4\" (UID: \"2c9a0315-50c2-42fd-adcd-068dbd90296d\") " pod="openshift-marketplace/certified-operators-zcmb4"
Oct 01 14:13:37 crc kubenswrapper[4851]: I1001 14:13:37.243983 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c9a0315-50c2-42fd-adcd-068dbd90296d-utilities\") pod \"certified-operators-zcmb4\" (UID: \"2c9a0315-50c2-42fd-adcd-068dbd90296d\") " pod="openshift-marketplace/certified-operators-zcmb4"
Oct 01 14:13:37 crc kubenswrapper[4851]: I1001 14:13:37.244114 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c9a0315-50c2-42fd-adcd-068dbd90296d-catalog-content\") pod \"certified-operators-zcmb4\" (UID: \"2c9a0315-50c2-42fd-adcd-068dbd90296d\") " pod="openshift-marketplace/certified-operators-zcmb4"
Oct 01 14:13:37 crc kubenswrapper[4851]: I1001 14:13:37.244196 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgljg\" (UniqueName: \"kubernetes.io/projected/2c9a0315-50c2-42fd-adcd-068dbd90296d-kube-api-access-xgljg\") pod \"certified-operators-zcmb4\" (UID: \"2c9a0315-50c2-42fd-adcd-068dbd90296d\") " pod="openshift-marketplace/certified-operators-zcmb4"
Oct 01 14:13:37 crc kubenswrapper[4851]: I1001 14:13:37.245023 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c9a0315-50c2-42fd-adcd-068dbd90296d-catalog-content\") pod \"certified-operators-zcmb4\" (UID: \"2c9a0315-50c2-42fd-adcd-068dbd90296d\") " pod="openshift-marketplace/certified-operators-zcmb4"
Oct 01 14:13:37 crc kubenswrapper[4851]: I1001 14:13:37.245018 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c9a0315-50c2-42fd-adcd-068dbd90296d-utilities\") pod \"certified-operators-zcmb4\" (UID: \"2c9a0315-50c2-42fd-adcd-068dbd90296d\") " pod="openshift-marketplace/certified-operators-zcmb4"
Oct 01 14:13:37 crc kubenswrapper[4851]: I1001 14:13:37.266836 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgljg\" (UniqueName: \"kubernetes.io/projected/2c9a0315-50c2-42fd-adcd-068dbd90296d-kube-api-access-xgljg\") pod \"certified-operators-zcmb4\" (UID: \"2c9a0315-50c2-42fd-adcd-068dbd90296d\") " pod="openshift-marketplace/certified-operators-zcmb4"
Oct 01 14:13:37 crc kubenswrapper[4851]: I1001 14:13:37.421235 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zcmb4"
Oct 01 14:13:37 crc kubenswrapper[4851]: I1001 14:13:37.986037 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zcmb4"]
Oct 01 14:13:38 crc kubenswrapper[4851]: I1001 14:13:38.464615 4851 generic.go:334] "Generic (PLEG): container finished" podID="2c9a0315-50c2-42fd-adcd-068dbd90296d" containerID="ab863cfbaa52739d45ab3363f78702344241a9a55bfec1fce7fd72f8fcca0c45" exitCode=0
Oct 01 14:13:38 crc kubenswrapper[4851]: I1001 14:13:38.464722 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zcmb4" event={"ID":"2c9a0315-50c2-42fd-adcd-068dbd90296d","Type":"ContainerDied","Data":"ab863cfbaa52739d45ab3363f78702344241a9a55bfec1fce7fd72f8fcca0c45"}
Oct 01 14:13:38 crc kubenswrapper[4851]: I1001 14:13:38.464934 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zcmb4" event={"ID":"2c9a0315-50c2-42fd-adcd-068dbd90296d","Type":"ContainerStarted","Data":"79ebafa607352c709aa35d5db546fa5f0b3fd7cf5de8d92bef53ff76902beb97"}
Oct 01 14:13:39 crc kubenswrapper[4851]: I1001 14:13:39.478396 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zcmb4" event={"ID":"2c9a0315-50c2-42fd-adcd-068dbd90296d","Type":"ContainerStarted","Data":"66eb3c5d6edfff73eab0aa85ea4ec21db27e2fc4579f2ee301f4308e84cc1366"}
Oct 01 14:13:40 crc kubenswrapper[4851]: I1001 14:13:40.490776 4851 generic.go:334] "Generic (PLEG): container finished" podID="2c9a0315-50c2-42fd-adcd-068dbd90296d" containerID="66eb3c5d6edfff73eab0aa85ea4ec21db27e2fc4579f2ee301f4308e84cc1366" exitCode=0
Oct 01 14:13:40 crc kubenswrapper[4851]: I1001 14:13:40.490835 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zcmb4" event={"ID":"2c9a0315-50c2-42fd-adcd-068dbd90296d","Type":"ContainerDied","Data":"66eb3c5d6edfff73eab0aa85ea4ec21db27e2fc4579f2ee301f4308e84cc1366"}
Oct 01 14:13:41 crc kubenswrapper[4851]: I1001 14:13:41.504487 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zcmb4" event={"ID":"2c9a0315-50c2-42fd-adcd-068dbd90296d","Type":"ContainerStarted","Data":"50d60366d7e2e3f0497b48a9bcfb7fb9e4dc070add51b85a172c056e3ff8fb96"}
Oct 01 14:13:41 crc kubenswrapper[4851]: I1001 14:13:41.531769 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zcmb4" podStartSLOduration=2.105046102 podStartE2EDuration="4.531754021s" podCreationTimestamp="2025-10-01 14:13:37 +0000 UTC" firstStartedPulling="2025-10-01 14:13:38.466536241 +0000 UTC m=+4826.811653757" lastFinishedPulling="2025-10-01 14:13:40.89324419 +0000 UTC m=+4829.238361676" observedRunningTime="2025-10-01 14:13:41.528886829 +0000 UTC m=+4829.874004355" watchObservedRunningTime="2025-10-01 14:13:41.531754021 +0000 UTC m=+4829.876871507"
Oct 01 14:13:47 crc kubenswrapper[4851]: I1001 14:13:47.422291 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zcmb4"
Oct 01 14:13:47 crc kubenswrapper[4851]: I1001 14:13:47.423202 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zcmb4"
Oct 01 14:13:47 crc kubenswrapper[4851]: I1001 14:13:47.511956 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zcmb4"
Oct 01 14:13:47 crc kubenswrapper[4851]: I1001 14:13:47.633406 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zcmb4"
Oct 01 14:13:47 crc kubenswrapper[4851]: I1001 14:13:47.759804 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zcmb4"]
Oct 01 14:13:49 crc kubenswrapper[4851]: I1001 14:13:49.585180 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zcmb4" podUID="2c9a0315-50c2-42fd-adcd-068dbd90296d" containerName="registry-server" containerID="cri-o://50d60366d7e2e3f0497b48a9bcfb7fb9e4dc070add51b85a172c056e3ff8fb96" gracePeriod=2
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.140352 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zcmb4"
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.206241 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c9a0315-50c2-42fd-adcd-068dbd90296d-catalog-content\") pod \"2c9a0315-50c2-42fd-adcd-068dbd90296d\" (UID: \"2c9a0315-50c2-42fd-adcd-068dbd90296d\") "
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.206284 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgljg\" (UniqueName: \"kubernetes.io/projected/2c9a0315-50c2-42fd-adcd-068dbd90296d-kube-api-access-xgljg\") pod \"2c9a0315-50c2-42fd-adcd-068dbd90296d\" (UID: \"2c9a0315-50c2-42fd-adcd-068dbd90296d\") "
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.206327 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c9a0315-50c2-42fd-adcd-068dbd90296d-utilities\") pod \"2c9a0315-50c2-42fd-adcd-068dbd90296d\" (UID: \"2c9a0315-50c2-42fd-adcd-068dbd90296d\") "
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.208439 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c9a0315-50c2-42fd-adcd-068dbd90296d-utilities" (OuterVolumeSpecName: "utilities") pod "2c9a0315-50c2-42fd-adcd-068dbd90296d" (UID: "2c9a0315-50c2-42fd-adcd-068dbd90296d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.215127 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c9a0315-50c2-42fd-adcd-068dbd90296d-kube-api-access-xgljg" (OuterVolumeSpecName: "kube-api-access-xgljg") pod "2c9a0315-50c2-42fd-adcd-068dbd90296d" (UID: "2c9a0315-50c2-42fd-adcd-068dbd90296d"). InnerVolumeSpecName "kube-api-access-xgljg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.276039 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c9a0315-50c2-42fd-adcd-068dbd90296d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c9a0315-50c2-42fd-adcd-068dbd90296d" (UID: "2c9a0315-50c2-42fd-adcd-068dbd90296d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.309539 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c9a0315-50c2-42fd-adcd-068dbd90296d-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.309599 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgljg\" (UniqueName: \"kubernetes.io/projected/2c9a0315-50c2-42fd-adcd-068dbd90296d-kube-api-access-xgljg\") on node \"crc\" DevicePath \"\""
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.309622 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c9a0315-50c2-42fd-adcd-068dbd90296d-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.611809 4851 generic.go:334] "Generic (PLEG): container finished" podID="2c9a0315-50c2-42fd-adcd-068dbd90296d" containerID="50d60366d7e2e3f0497b48a9bcfb7fb9e4dc070add51b85a172c056e3ff8fb96" exitCode=0
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.612062 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zcmb4" event={"ID":"2c9a0315-50c2-42fd-adcd-068dbd90296d","Type":"ContainerDied","Data":"50d60366d7e2e3f0497b48a9bcfb7fb9e4dc070add51b85a172c056e3ff8fb96"}
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.613210 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zcmb4" event={"ID":"2c9a0315-50c2-42fd-adcd-068dbd90296d","Type":"ContainerDied","Data":"79ebafa607352c709aa35d5db546fa5f0b3fd7cf5de8d92bef53ff76902beb97"}
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.613353 4851 scope.go:117] "RemoveContainer" containerID="50d60366d7e2e3f0497b48a9bcfb7fb9e4dc070add51b85a172c056e3ff8fb96"
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.612159 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zcmb4"
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.654600 4851 scope.go:117] "RemoveContainer" containerID="66eb3c5d6edfff73eab0aa85ea4ec21db27e2fc4579f2ee301f4308e84cc1366"
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.656167 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zcmb4"]
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.667230 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zcmb4"]
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.687546 4851 scope.go:117] "RemoveContainer" containerID="ab863cfbaa52739d45ab3363f78702344241a9a55bfec1fce7fd72f8fcca0c45"
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.773569 4851 scope.go:117] "RemoveContainer" containerID="50d60366d7e2e3f0497b48a9bcfb7fb9e4dc070add51b85a172c056e3ff8fb96"
Oct 01 14:13:50 crc kubenswrapper[4851]: E1001 14:13:50.774393 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d60366d7e2e3f0497b48a9bcfb7fb9e4dc070add51b85a172c056e3ff8fb96\": container with ID starting with 50d60366d7e2e3f0497b48a9bcfb7fb9e4dc070add51b85a172c056e3ff8fb96 not found: ID does not exist" containerID="50d60366d7e2e3f0497b48a9bcfb7fb9e4dc070add51b85a172c056e3ff8fb96"
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.774599 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d60366d7e2e3f0497b48a9bcfb7fb9e4dc070add51b85a172c056e3ff8fb96"} err="failed to get container status \"50d60366d7e2e3f0497b48a9bcfb7fb9e4dc070add51b85a172c056e3ff8fb96\": rpc error: code = NotFound desc = could not find container \"50d60366d7e2e3f0497b48a9bcfb7fb9e4dc070add51b85a172c056e3ff8fb96\": container with ID starting with 50d60366d7e2e3f0497b48a9bcfb7fb9e4dc070add51b85a172c056e3ff8fb96 not found: ID does not exist"
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.774722 4851 scope.go:117] "RemoveContainer" containerID="66eb3c5d6edfff73eab0aa85ea4ec21db27e2fc4579f2ee301f4308e84cc1366"
Oct 01 14:13:50 crc kubenswrapper[4851]: E1001 14:13:50.775263 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66eb3c5d6edfff73eab0aa85ea4ec21db27e2fc4579f2ee301f4308e84cc1366\": container with ID starting with 66eb3c5d6edfff73eab0aa85ea4ec21db27e2fc4579f2ee301f4308e84cc1366 not found: ID does not exist" containerID="66eb3c5d6edfff73eab0aa85ea4ec21db27e2fc4579f2ee301f4308e84cc1366"
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.775299 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66eb3c5d6edfff73eab0aa85ea4ec21db27e2fc4579f2ee301f4308e84cc1366"} err="failed to get container status \"66eb3c5d6edfff73eab0aa85ea4ec21db27e2fc4579f2ee301f4308e84cc1366\": rpc error: code = NotFound desc = could not find container \"66eb3c5d6edfff73eab0aa85ea4ec21db27e2fc4579f2ee301f4308e84cc1366\": container with ID starting with 66eb3c5d6edfff73eab0aa85ea4ec21db27e2fc4579f2ee301f4308e84cc1366 not found: ID does not exist"
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.775326 4851 scope.go:117] "RemoveContainer" containerID="ab863cfbaa52739d45ab3363f78702344241a9a55bfec1fce7fd72f8fcca0c45"
Oct 01 14:13:50 crc kubenswrapper[4851]: E1001 14:13:50.775660 4851 log.go:32] "ContainerStatus from runtime service 
Oct 01 14:13:50 crc kubenswrapper[4851]: I1001 14:13:50.775698 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab863cfbaa52739d45ab3363f78702344241a9a55bfec1fce7fd72f8fcca0c45"} err="failed to get container status \"ab863cfbaa52739d45ab3363f78702344241a9a55bfec1fce7fd72f8fcca0c45\": rpc error: code = NotFound desc = could not find container \"ab863cfbaa52739d45ab3363f78702344241a9a55bfec1fce7fd72f8fcca0c45\": container with ID starting with ab863cfbaa52739d45ab3363f78702344241a9a55bfec1fce7fd72f8fcca0c45 not found: ID does not exist"
Oct 01 14:13:52 crc kubenswrapper[4851]: I1001 14:13:52.350097 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c9a0315-50c2-42fd-adcd-068dbd90296d" path="/var/lib/kubelet/pods/2c9a0315-50c2-42fd-adcd-068dbd90296d/volumes"
Oct 01 14:14:00 crc kubenswrapper[4851]: I1001 14:14:00.050079 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 14:14:00 crc kubenswrapper[4851]: I1001 14:14:00.050720 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 14:14:30 crc kubenswrapper[4851]: I1001 14:14:30.050201 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 14:14:30 crc kubenswrapper[4851]: I1001 14:14:30.050902 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 14:14:30 crc kubenswrapper[4851]: I1001 14:14:30.050964 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv72m"
Oct 01 14:14:30 crc kubenswrapper[4851]: I1001 14:14:30.052099 4851 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3bedab356653f0132308f759754cf021851ccb18415c5aa7a1c2d58cad13e034"} pod="openshift-machine-config-operator/machine-config-daemon-fv72m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 01 14:14:30 crc kubenswrapper[4851]: I1001 14:14:30.052197 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" containerID="cri-o://3bedab356653f0132308f759754cf021851ccb18415c5aa7a1c2d58cad13e034" gracePeriod=600
Oct 01 14:14:30 crc kubenswrapper[4851]: I1001 14:14:30.194815 4851 generic.go:334] "Generic (PLEG): container finished" podID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerID="3bedab356653f0132308f759754cf021851ccb18415c5aa7a1c2d58cad13e034" exitCode=0
Oct 01 14:14:30 crc kubenswrapper[4851]: I1001 14:14:30.194855 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerDied","Data":"3bedab356653f0132308f759754cf021851ccb18415c5aa7a1c2d58cad13e034"}
Oct 01 14:14:30 crc kubenswrapper[4851]: I1001 14:14:30.194888 4851 scope.go:117] "RemoveContainer" containerID="972e98d79426fa8e240159a272e8d018965dd9376718d213f8db21467e8a0290"
Oct 01 14:14:31 crc kubenswrapper[4851]: I1001 14:14:31.210028 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed"}
Oct 01 14:15:00 crc kubenswrapper[4851]: I1001 14:15:00.173094 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322135-4lhfz"]
Oct 01 14:15:00 crc kubenswrapper[4851]: E1001 14:15:00.174110 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c9a0315-50c2-42fd-adcd-068dbd90296d" containerName="registry-server"
Oct 01 14:15:00 crc kubenswrapper[4851]: I1001 14:15:00.174129 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c9a0315-50c2-42fd-adcd-068dbd90296d" containerName="registry-server"
Oct 01 14:15:00 crc kubenswrapper[4851]: E1001 14:15:00.174159 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c9a0315-50c2-42fd-adcd-068dbd90296d" containerName="extract-utilities"
Oct 01 14:15:00 crc kubenswrapper[4851]: I1001 14:15:00.174167 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c9a0315-50c2-42fd-adcd-068dbd90296d" containerName="extract-utilities"
Oct 01 14:15:00 crc kubenswrapper[4851]: E1001 14:15:00.174183 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c9a0315-50c2-42fd-adcd-068dbd90296d" containerName="extract-content"
Oct 01 14:15:00 crc kubenswrapper[4851]: I1001 14:15:00.174191 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c9a0315-50c2-42fd-adcd-068dbd90296d" containerName="extract-content"
Oct 01 14:15:00 crc kubenswrapper[4851]: I1001 14:15:00.174471 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c9a0315-50c2-42fd-adcd-068dbd90296d" containerName="registry-server"
Oct 01 14:15:00 crc kubenswrapper[4851]: I1001 14:15:00.175397 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-4lhfz"
Oct 01 14:15:00 crc kubenswrapper[4851]: I1001 14:15:00.178529 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 01 14:15:00 crc kubenswrapper[4851]: I1001 14:15:00.185343 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 01 14:15:00 crc kubenswrapper[4851]: I1001 14:15:00.187600 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322135-4lhfz"]
Oct 01 14:15:00 crc kubenswrapper[4851]: I1001 14:15:00.193379 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c12457c8-49bb-47a0-80cd-c78a5efe2062-secret-volume\") pod \"collect-profiles-29322135-4lhfz\" (UID: \"c12457c8-49bb-47a0-80cd-c78a5efe2062\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-4lhfz"
Oct 01 14:15:00 crc kubenswrapper[4851]: I1001 14:15:00.193573 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz2xf\" (UniqueName: \"kubernetes.io/projected/c12457c8-49bb-47a0-80cd-c78a5efe2062-kube-api-access-mz2xf\") pod \"collect-profiles-29322135-4lhfz\" (UID: \"c12457c8-49bb-47a0-80cd-c78a5efe2062\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-4lhfz"
Oct 01 14:15:00 crc kubenswrapper[4851]: I1001 14:15:00.193731 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c12457c8-49bb-47a0-80cd-c78a5efe2062-config-volume\") pod \"collect-profiles-29322135-4lhfz\" (UID: \"c12457c8-49bb-47a0-80cd-c78a5efe2062\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-4lhfz"
Oct 01 14:15:00 crc kubenswrapper[4851]: I1001 14:15:00.296303 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c12457c8-49bb-47a0-80cd-c78a5efe2062-config-volume\") pod \"collect-profiles-29322135-4lhfz\" (UID: \"c12457c8-49bb-47a0-80cd-c78a5efe2062\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-4lhfz"
Oct 01 14:15:00 crc kubenswrapper[4851]: I1001 14:15:00.296550 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c12457c8-49bb-47a0-80cd-c78a5efe2062-secret-volume\") pod \"collect-profiles-29322135-4lhfz\" (UID: \"c12457c8-49bb-47a0-80cd-c78a5efe2062\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-4lhfz"
Oct 01 14:15:00 crc kubenswrapper[4851]: I1001 14:15:00.296675 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz2xf\" (UniqueName: \"kubernetes.io/projected/c12457c8-49bb-47a0-80cd-c78a5efe2062-kube-api-access-mz2xf\") pod \"collect-profiles-29322135-4lhfz\" (UID: \"c12457c8-49bb-47a0-80cd-c78a5efe2062\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-4lhfz"
Oct 01 14:15:00 crc kubenswrapper[4851]: I1001 14:15:00.297718 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c12457c8-49bb-47a0-80cd-c78a5efe2062-config-volume\") pod \"collect-profiles-29322135-4lhfz\" (UID: \"c12457c8-49bb-47a0-80cd-c78a5efe2062\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-4lhfz"
Oct 01 14:15:00 crc kubenswrapper[4851]: I1001 14:15:00.303791 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c12457c8-49bb-47a0-80cd-c78a5efe2062-secret-volume\") pod \"collect-profiles-29322135-4lhfz\" (UID: \"c12457c8-49bb-47a0-80cd-c78a5efe2062\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-4lhfz"
Oct 01 14:15:00 crc kubenswrapper[4851]: I1001 14:15:00.316598 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz2xf\" (UniqueName: \"kubernetes.io/projected/c12457c8-49bb-47a0-80cd-c78a5efe2062-kube-api-access-mz2xf\") pod \"collect-profiles-29322135-4lhfz\" (UID: \"c12457c8-49bb-47a0-80cd-c78a5efe2062\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-4lhfz"
Oct 01 14:15:00 crc kubenswrapper[4851]: I1001 14:15:00.497695 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-4lhfz"
Oct 01 14:15:00 crc kubenswrapper[4851]: I1001 14:15:00.998520 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322135-4lhfz"]
Oct 01 14:15:01 crc kubenswrapper[4851]: I1001 14:15:01.592042 4851 generic.go:334] "Generic (PLEG): container finished" podID="c12457c8-49bb-47a0-80cd-c78a5efe2062" containerID="9574136fd8d51e92fea78a1ca16e8432152466baae7a7204d3e9ec11bbdb90ff" exitCode=0
Oct 01 14:15:01 crc kubenswrapper[4851]: I1001 14:15:01.592375 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-4lhfz" event={"ID":"c12457c8-49bb-47a0-80cd-c78a5efe2062","Type":"ContainerDied","Data":"9574136fd8d51e92fea78a1ca16e8432152466baae7a7204d3e9ec11bbdb90ff"}
Oct 01 14:15:01 crc kubenswrapper[4851]: I1001 14:15:01.592405 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-4lhfz" event={"ID":"c12457c8-49bb-47a0-80cd-c78a5efe2062","Type":"ContainerStarted","Data":"d3dda1a21e35b9fae4f338221a8d2fa913a77cbae95de8247acfb0a8b085342f"}
Oct 01 14:15:03 crc kubenswrapper[4851]: I1001 14:15:03.086585 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-4lhfz"
Oct 01 14:15:03 crc kubenswrapper[4851]: I1001 14:15:03.270607 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c12457c8-49bb-47a0-80cd-c78a5efe2062-config-volume\") pod \"c12457c8-49bb-47a0-80cd-c78a5efe2062\" (UID: \"c12457c8-49bb-47a0-80cd-c78a5efe2062\") "
Oct 01 14:15:03 crc kubenswrapper[4851]: I1001 14:15:03.271111 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c12457c8-49bb-47a0-80cd-c78a5efe2062-secret-volume\") pod \"c12457c8-49bb-47a0-80cd-c78a5efe2062\" (UID: \"c12457c8-49bb-47a0-80cd-c78a5efe2062\") "
Oct 01 14:15:03 crc kubenswrapper[4851]: I1001 14:15:03.271252 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz2xf\" (UniqueName: \"kubernetes.io/projected/c12457c8-49bb-47a0-80cd-c78a5efe2062-kube-api-access-mz2xf\") pod \"c12457c8-49bb-47a0-80cd-c78a5efe2062\" (UID: \"c12457c8-49bb-47a0-80cd-c78a5efe2062\") "
Oct 01 14:15:03 crc kubenswrapper[4851]: I1001 14:15:03.271461 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c12457c8-49bb-47a0-80cd-c78a5efe2062-config-volume" (OuterVolumeSpecName: "config-volume") pod "c12457c8-49bb-47a0-80cd-c78a5efe2062" (UID: "c12457c8-49bb-47a0-80cd-c78a5efe2062"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:15:03 crc kubenswrapper[4851]: I1001 14:15:03.271848 4851 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c12457c8-49bb-47a0-80cd-c78a5efe2062-config-volume\") on node \"crc\" DevicePath \"\""
Oct 01 14:15:03 crc kubenswrapper[4851]: I1001 14:15:03.277671 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c12457c8-49bb-47a0-80cd-c78a5efe2062-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c12457c8-49bb-47a0-80cd-c78a5efe2062" (UID: "c12457c8-49bb-47a0-80cd-c78a5efe2062"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 14:15:03 crc kubenswrapper[4851]: I1001 14:15:03.278402 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c12457c8-49bb-47a0-80cd-c78a5efe2062-kube-api-access-mz2xf" (OuterVolumeSpecName: "kube-api-access-mz2xf") pod "c12457c8-49bb-47a0-80cd-c78a5efe2062" (UID: "c12457c8-49bb-47a0-80cd-c78a5efe2062"). InnerVolumeSpecName "kube-api-access-mz2xf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:15:03 crc kubenswrapper[4851]: I1001 14:15:03.373918 4851 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c12457c8-49bb-47a0-80cd-c78a5efe2062-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 01 14:15:03 crc kubenswrapper[4851]: I1001 14:15:03.373963 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz2xf\" (UniqueName: \"kubernetes.io/projected/c12457c8-49bb-47a0-80cd-c78a5efe2062-kube-api-access-mz2xf\") on node \"crc\" DevicePath \"\""
Oct 01 14:15:03 crc kubenswrapper[4851]: I1001 14:15:03.616483 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-4lhfz" event={"ID":"c12457c8-49bb-47a0-80cd-c78a5efe2062","Type":"ContainerDied","Data":"d3dda1a21e35b9fae4f338221a8d2fa913a77cbae95de8247acfb0a8b085342f"}
Oct 01 14:15:03 crc kubenswrapper[4851]: I1001 14:15:03.616562 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3dda1a21e35b9fae4f338221a8d2fa913a77cbae95de8247acfb0a8b085342f"
Oct 01 14:15:03 crc kubenswrapper[4851]: I1001 14:15:03.616637 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-4lhfz"
Oct 01 14:15:04 crc kubenswrapper[4851]: I1001 14:15:04.179967 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-6hcgd"]
Oct 01 14:15:04 crc kubenswrapper[4851]: I1001 14:15:04.191239 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-6hcgd"]
Oct 01 14:15:04 crc kubenswrapper[4851]: I1001 14:15:04.343550 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffd845e9-6f20-4b95-9d0d-b71c66d0ea05" path="/var/lib/kubelet/pods/ffd845e9-6f20-4b95-9d0d-b71c66d0ea05/volumes"
Oct 01 14:15:19 crc kubenswrapper[4851]: I1001 14:15:19.930991 4851 scope.go:117] "RemoveContainer" containerID="1a1570d4f6b5d5f0c56352a31d7a4b8f766c47c9db00f615b20952e6d4d59e87"
Oct 01 14:16:30 crc kubenswrapper[4851]: I1001 14:16:30.050759 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 14:16:30 crc kubenswrapper[4851]: I1001 14:16:30.052778 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 14:17:00 crc kubenswrapper[4851]: I1001 14:17:00.049925 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 14:17:00 crc kubenswrapper[4851]: I1001 14:17:00.051497 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 14:17:30 crc kubenswrapper[4851]: I1001 14:17:30.049914 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 14:17:30 crc kubenswrapper[4851]: I1001 14:17:30.050843 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 14:17:30 crc kubenswrapper[4851]: I1001 14:17:30.050915 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv72m"
Oct 01 14:17:30 crc kubenswrapper[4851]: I1001 14:17:30.052253 4851 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed"} pod="openshift-machine-config-operator/machine-config-daemon-fv72m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 01 14:17:30 crc kubenswrapper[4851]: I1001 14:17:30.052376 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" containerID="cri-o://f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed" gracePeriod=600
Oct 01 14:17:30 crc kubenswrapper[4851]: E1001 14:17:30.191325 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 14:17:30 crc kubenswrapper[4851]: I1001 14:17:30.411862 4851 generic.go:334] "Generic (PLEG): container finished" podID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed" exitCode=0
Oct 01 14:17:30 crc kubenswrapper[4851]: I1001 14:17:30.411908 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerDied","Data":"f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed"}
Oct 01 14:17:30 crc kubenswrapper[4851]: I1001 14:17:30.412321 4851 scope.go:117] "RemoveContainer" containerID="3bedab356653f0132308f759754cf021851ccb18415c5aa7a1c2d58cad13e034"
Oct 01 14:17:30 crc kubenswrapper[4851]: I1001 14:17:30.413368 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed"
Oct 01 14:17:30 crc kubenswrapper[4851]: E1001 14:17:30.413944 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 14:17:45 crc kubenswrapper[4851]: I1001 14:17:45.327964 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed"
Oct 01 14:17:45 crc kubenswrapper[4851]: E1001 14:17:45.328701 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 14:17:57 crc kubenswrapper[4851]: I1001 14:17:57.330047 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed"
Oct 01 14:17:57 crc kubenswrapper[4851]: E1001 14:17:57.331267 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 14:18:10 crc kubenswrapper[4851]: I1001 14:18:10.329388 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed"
Oct 01 14:18:10 crc kubenswrapper[4851]: E1001 14:18:10.330593 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 14:18:21 crc kubenswrapper[4851]: I1001 14:18:21.328130 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed"
Oct 01 14:18:21 crc kubenswrapper[4851]: E1001 14:18:21.328815 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 14:18:36 crc kubenswrapper[4851]: I1001 14:18:36.328947 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed"
Oct 01 14:18:36 crc kubenswrapper[4851]: E1001 14:18:36.329922 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 14:18:49 crc kubenswrapper[4851]: I1001 14:18:49.329352 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed"
Oct 01 14:18:49 crc kubenswrapper[4851]: E1001 14:18:49.330494 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 14:18:50 crc kubenswrapper[4851]: I1001 14:18:50.016676 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gf6xf"]
Oct 01 14:18:50 crc kubenswrapper[4851]: E1001 14:18:50.017218 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c12457c8-49bb-47a0-80cd-c78a5efe2062" containerName="collect-profiles"
Oct 01 14:18:50 crc kubenswrapper[4851]: I1001 14:18:50.017259 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="c12457c8-49bb-47a0-80cd-c78a5efe2062" containerName="collect-profiles"
Oct 01 14:18:50 crc kubenswrapper[4851]: I1001 14:18:50.017542 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="c12457c8-49bb-47a0-80cd-c78a5efe2062" containerName="collect-profiles"
Oct 01 14:18:50 crc kubenswrapper[4851]: I1001 14:18:50.019377 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gf6xf"
Oct 01 14:18:50 crc kubenswrapper[4851]: I1001 14:18:50.031657 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gf6xf"]
Oct 01 14:18:50 crc kubenswrapper[4851]: I1001 14:18:50.098130 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55mrj\" (UniqueName: \"kubernetes.io/projected/bf317c91-e925-4224-bbfe-9e408460ca58-kube-api-access-55mrj\") pod \"community-operators-gf6xf\" (UID: \"bf317c91-e925-4224-bbfe-9e408460ca58\") " pod="openshift-marketplace/community-operators-gf6xf"
Oct 01 14:18:50 crc kubenswrapper[4851]: I1001 14:18:50.098317 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf317c91-e925-4224-bbfe-9e408460ca58-catalog-content\") pod \"community-operators-gf6xf\" (UID: \"bf317c91-e925-4224-bbfe-9e408460ca58\") " pod="openshift-marketplace/community-operators-gf6xf"
Oct 01 14:18:50 crc kubenswrapper[4851]: I1001 14:18:50.098371 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf317c91-e925-4224-bbfe-9e408460ca58-utilities\") pod \"community-operators-gf6xf\" (UID: \"bf317c91-e925-4224-bbfe-9e408460ca58\") " pod="openshift-marketplace/community-operators-gf6xf"
Oct 01 14:18:50 crc kubenswrapper[4851]: I1001 14:18:50.200773 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55mrj\" (UniqueName: \"kubernetes.io/projected/bf317c91-e925-4224-bbfe-9e408460ca58-kube-api-access-55mrj\") pod \"community-operators-gf6xf\" (UID: \"bf317c91-e925-4224-bbfe-9e408460ca58\") " pod="openshift-marketplace/community-operators-gf6xf"
Oct 01 14:18:50 crc kubenswrapper[4851]: I1001 14:18:50.200984 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf317c91-e925-4224-bbfe-9e408460ca58-catalog-content\") pod \"community-operators-gf6xf\" (UID: \"bf317c91-e925-4224-bbfe-9e408460ca58\") " pod="openshift-marketplace/community-operators-gf6xf"
Oct 01 14:18:50 crc kubenswrapper[4851]: I1001 14:18:50.201056 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf317c91-e925-4224-bbfe-9e408460ca58-utilities\") pod \"community-operators-gf6xf\" (UID: \"bf317c91-e925-4224-bbfe-9e408460ca58\") " pod="openshift-marketplace/community-operators-gf6xf"
Oct 01 14:18:50 crc kubenswrapper[4851]: I1001 14:18:50.201594 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf317c91-e925-4224-bbfe-9e408460ca58-catalog-content\") pod \"community-operators-gf6xf\" (UID: \"bf317c91-e925-4224-bbfe-9e408460ca58\") " pod="openshift-marketplace/community-operators-gf6xf"
Oct 01 14:18:50 crc kubenswrapper[4851]: I1001 14:18:50.201646 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf317c91-e925-4224-bbfe-9e408460ca58-utilities\") pod \"community-operators-gf6xf\" (UID: \"bf317c91-e925-4224-bbfe-9e408460ca58\") " pod="openshift-marketplace/community-operators-gf6xf"
Oct 01 14:18:50 crc kubenswrapper[4851]: I1001 14:18:50.221281 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55mrj\" (UniqueName: \"kubernetes.io/projected/bf317c91-e925-4224-bbfe-9e408460ca58-kube-api-access-55mrj\") pod \"community-operators-gf6xf\" (UID: \"bf317c91-e925-4224-bbfe-9e408460ca58\") " pod="openshift-marketplace/community-operators-gf6xf"
Oct 01 14:18:50 crc kubenswrapper[4851]: I1001 14:18:50.341695 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gf6xf"
Oct 01 14:18:50 crc kubenswrapper[4851]: I1001 14:18:50.847617 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gf6xf"]
Oct 01 14:18:51 crc kubenswrapper[4851]: I1001 14:18:51.404715 4851 generic.go:334] "Generic (PLEG): container finished" podID="bf317c91-e925-4224-bbfe-9e408460ca58" containerID="d60c21a8cd9f1ab95a71874cb40bba5d8d078771c04524d7a69c679ab594b98a" exitCode=0
Oct 01 14:18:51 crc kubenswrapper[4851]: I1001 14:18:51.404817 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gf6xf" event={"ID":"bf317c91-e925-4224-bbfe-9e408460ca58","Type":"ContainerDied","Data":"d60c21a8cd9f1ab95a71874cb40bba5d8d078771c04524d7a69c679ab594b98a"}
Oct 01 14:18:51 crc kubenswrapper[4851]: I1001 14:18:51.405077 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gf6xf" event={"ID":"bf317c91-e925-4224-bbfe-9e408460ca58","Type":"ContainerStarted","Data":"f5af605ff22fd20e4e0e71a9849a10a404f6142cb574ae71e13a70699f537d82"}
Oct 01 14:18:51 crc kubenswrapper[4851]: I1001 14:18:51.407403 4851 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 01 14:18:52 crc kubenswrapper[4851]: I1001 14:18:52.417881 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gf6xf" event={"ID":"bf317c91-e925-4224-bbfe-9e408460ca58","Type":"ContainerStarted","Data":"5a4ab9ada0bb53412af52eff4126f7d2b3644db7d1503c208abe507a104c7153"}
Oct 01 14:18:53 crc kubenswrapper[4851]: I1001 14:18:53.427771 4851 generic.go:334] "Generic (PLEG): container finished" podID="bf317c91-e925-4224-bbfe-9e408460ca58" containerID="5a4ab9ada0bb53412af52eff4126f7d2b3644db7d1503c208abe507a104c7153" exitCode=0
Oct 01 14:18:53 crc kubenswrapper[4851]: I1001 14:18:53.427811 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gf6xf" event={"ID":"bf317c91-e925-4224-bbfe-9e408460ca58","Type":"ContainerDied","Data":"5a4ab9ada0bb53412af52eff4126f7d2b3644db7d1503c208abe507a104c7153"}
Oct 01 14:18:54 crc kubenswrapper[4851]: I1001 14:18:54.449053 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gf6xf" event={"ID":"bf317c91-e925-4224-bbfe-9e408460ca58","Type":"ContainerStarted","Data":"cbe77c052337c04226f0c2f0c7c348addd828538a06481b1a0dcfa79b503bfb4"}
Oct 01 14:18:54 crc kubenswrapper[4851]: I1001 14:18:54.483115 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gf6xf" podStartSLOduration=2.7072071920000003 podStartE2EDuration="5.483086231s" podCreationTimestamp="2025-10-01 14:18:49 +0000 UTC" firstStartedPulling="2025-10-01 14:18:51.407104809 +0000 UTC m=+5139.752222305" lastFinishedPulling="2025-10-01 14:18:54.182983858 +0000 UTC m=+5142.528101344" observedRunningTime="2025-10-01 14:18:54.47183947 +0000 UTC m=+5142.816956956" watchObservedRunningTime="2025-10-01 14:18:54.483086231 +0000 UTC m=+5142.828203767"
Oct 01 14:19:00 crc kubenswrapper[4851]: I1001 14:19:00.345637 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gf6xf"
Oct 01 14:19:00 crc kubenswrapper[4851]: I1001 14:19:00.346294 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gf6xf"
Oct 01 14:19:00 crc kubenswrapper[4851]: I1001 14:19:00.411358 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gf6xf"
Oct 01 14:19:00 crc kubenswrapper[4851]: I1001 14:19:00.586814 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gf6xf"
Oct 01 14:19:00 crc kubenswrapper[4851]: I1001 14:19:00.654428 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gf6xf"]
Oct 01 14:19:02 crc kubenswrapper[4851]: I1001 14:19:02.337696 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed"
Oct 01 14:19:02 crc kubenswrapper[4851]: E1001 14:19:02.338577 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 14:19:02 crc kubenswrapper[4851]: I1001 14:19:02.551084 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gf6xf" podUID="bf317c91-e925-4224-bbfe-9e408460ca58" containerName="registry-server" containerID="cri-o://cbe77c052337c04226f0c2f0c7c348addd828538a06481b1a0dcfa79b503bfb4" gracePeriod=2
Oct 01 14:19:03 crc kubenswrapper[4851]: I1001 14:19:03.578254 4851 generic.go:334] "Generic (PLEG): container finished" podID="bf317c91-e925-4224-bbfe-9e408460ca58" containerID="cbe77c052337c04226f0c2f0c7c348addd828538a06481b1a0dcfa79b503bfb4" exitCode=0
Oct 01 14:19:03 crc kubenswrapper[4851]: I1001 14:19:03.578344 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gf6xf" event={"ID":"bf317c91-e925-4224-bbfe-9e408460ca58","Type":"ContainerDied","Data":"cbe77c052337c04226f0c2f0c7c348addd828538a06481b1a0dcfa79b503bfb4"}
Oct 01 14:19:03 crc kubenswrapper[4851]: I1001 14:19:03.578778 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gf6xf" event={"ID":"bf317c91-e925-4224-bbfe-9e408460ca58","Type":"ContainerDied","Data":"f5af605ff22fd20e4e0e71a9849a10a404f6142cb574ae71e13a70699f537d82"}
Oct 01 14:19:03 crc kubenswrapper[4851]: I1001 14:19:03.578809 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5af605ff22fd20e4e0e71a9849a10a404f6142cb574ae71e13a70699f537d82"
Oct 01 14:19:03 crc kubenswrapper[4851]: I1001 14:19:03.659796 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gf6xf"
Oct 01 14:19:03 crc kubenswrapper[4851]: I1001 14:19:03.727239 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf317c91-e925-4224-bbfe-9e408460ca58-utilities\") pod \"bf317c91-e925-4224-bbfe-9e408460ca58\" (UID: \"bf317c91-e925-4224-bbfe-9e408460ca58\") "
Oct 01 14:19:03 crc kubenswrapper[4851]: I1001 14:19:03.727805 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf317c91-e925-4224-bbfe-9e408460ca58-catalog-content\") pod \"bf317c91-e925-4224-bbfe-9e408460ca58\" (UID: \"bf317c91-e925-4224-bbfe-9e408460ca58\") "
Oct 01 14:19:03 crc kubenswrapper[4851]: I1001 14:19:03.727957 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55mrj\" (UniqueName: \"kubernetes.io/projected/bf317c91-e925-4224-bbfe-9e408460ca58-kube-api-access-55mrj\") pod \"bf317c91-e925-4224-bbfe-9e408460ca58\" (UID: \"bf317c91-e925-4224-bbfe-9e408460ca58\") "
Oct 01 14:19:03 crc kubenswrapper[4851]: I1001 14:19:03.728153 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf317c91-e925-4224-bbfe-9e408460ca58-utilities" (OuterVolumeSpecName: "utilities") pod "bf317c91-e925-4224-bbfe-9e408460ca58" (UID: "bf317c91-e925-4224-bbfe-9e408460ca58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 14:19:03 crc kubenswrapper[4851]: I1001 14:19:03.728547 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf317c91-e925-4224-bbfe-9e408460ca58-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 14:19:03 crc kubenswrapper[4851]: I1001 14:19:03.737774 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf317c91-e925-4224-bbfe-9e408460ca58-kube-api-access-55mrj" (OuterVolumeSpecName: "kube-api-access-55mrj") pod "bf317c91-e925-4224-bbfe-9e408460ca58" (UID: "bf317c91-e925-4224-bbfe-9e408460ca58"). InnerVolumeSpecName "kube-api-access-55mrj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:19:03 crc kubenswrapper[4851]: I1001 14:19:03.785331 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf317c91-e925-4224-bbfe-9e408460ca58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf317c91-e925-4224-bbfe-9e408460ca58" (UID: "bf317c91-e925-4224-bbfe-9e408460ca58"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 14:19:03 crc kubenswrapper[4851]: I1001 14:19:03.829312 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf317c91-e925-4224-bbfe-9e408460ca58-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 14:19:03 crc kubenswrapper[4851]: I1001 14:19:03.829352 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55mrj\" (UniqueName: \"kubernetes.io/projected/bf317c91-e925-4224-bbfe-9e408460ca58-kube-api-access-55mrj\") on node \"crc\" DevicePath \"\""
Oct 01 14:19:04 crc kubenswrapper[4851]: I1001 14:19:04.588122 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gf6xf"
Oct 01 14:19:04 crc kubenswrapper[4851]: I1001 14:19:04.625739 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gf6xf"]
Oct 01 14:19:04 crc kubenswrapper[4851]: I1001 14:19:04.633085 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gf6xf"]
Oct 01 14:19:06 crc kubenswrapper[4851]: I1001 14:19:06.344204 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf317c91-e925-4224-bbfe-9e408460ca58" path="/var/lib/kubelet/pods/bf317c91-e925-4224-bbfe-9e408460ca58/volumes"
Oct 01 14:19:06 crc kubenswrapper[4851]: I1001 14:19:06.386819 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hfjk5"]
Oct 01 14:19:06 crc kubenswrapper[4851]: E1001 14:19:06.387594 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf317c91-e925-4224-bbfe-9e408460ca58" containerName="extract-utilities"
Oct 01 14:19:06 crc kubenswrapper[4851]: I1001 14:19:06.387616 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf317c91-e925-4224-bbfe-9e408460ca58" containerName="extract-utilities"
Oct 01 14:19:06 crc kubenswrapper[4851]: E1001 14:19:06.387648 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf317c91-e925-4224-bbfe-9e408460ca58" containerName="extract-content"
Oct 01 14:19:06 crc kubenswrapper[4851]: I1001 14:19:06.387657 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf317c91-e925-4224-bbfe-9e408460ca58" containerName="extract-content"
Oct 01 14:19:06 crc kubenswrapper[4851]: E1001 14:19:06.387695 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf317c91-e925-4224-bbfe-9e408460ca58" containerName="registry-server"
Oct 01 14:19:06 crc kubenswrapper[4851]: I1001 14:19:06.387703 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf317c91-e925-4224-bbfe-9e408460ca58" containerName="registry-server"
Oct 01 14:19:06 crc kubenswrapper[4851]: I1001 14:19:06.387952 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf317c91-e925-4224-bbfe-9e408460ca58" containerName="registry-server"
Oct 01 14:19:06 crc kubenswrapper[4851]: I1001 14:19:06.389755 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hfjk5"
Oct 01 14:19:06 crc kubenswrapper[4851]: I1001 14:19:06.399850 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfjk5"]
Oct 01 14:19:06 crc kubenswrapper[4851]: I1001 14:19:06.488455 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c696af7-ff23-4519-b874-e4c8f2277e88-utilities\") pod \"redhat-marketplace-hfjk5\" (UID: \"4c696af7-ff23-4519-b874-e4c8f2277e88\") " pod="openshift-marketplace/redhat-marketplace-hfjk5"
Oct 01 14:19:06 crc kubenswrapper[4851]: I1001 14:19:06.488596 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c696af7-ff23-4519-b874-e4c8f2277e88-catalog-content\") pod \"redhat-marketplace-hfjk5\" (UID: \"4c696af7-ff23-4519-b874-e4c8f2277e88\") " pod="openshift-marketplace/redhat-marketplace-hfjk5"
Oct 01 14:19:06 crc kubenswrapper[4851]: I1001 14:19:06.488629 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srm25\" (UniqueName: \"kubernetes.io/projected/4c696af7-ff23-4519-b874-e4c8f2277e88-kube-api-access-srm25\") pod \"redhat-marketplace-hfjk5\" (UID: \"4c696af7-ff23-4519-b874-e4c8f2277e88\") " pod="openshift-marketplace/redhat-marketplace-hfjk5"
Oct 01 14:19:06 crc kubenswrapper[4851]: I1001 14:19:06.590657 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c696af7-ff23-4519-b874-e4c8f2277e88-utilities\") pod \"redhat-marketplace-hfjk5\" (UID: \"4c696af7-ff23-4519-b874-e4c8f2277e88\") " pod="openshift-marketplace/redhat-marketplace-hfjk5"
Oct 01 14:19:06 crc kubenswrapper[4851]: I1001 14:19:06.590943 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c696af7-ff23-4519-b874-e4c8f2277e88-catalog-content\") pod \"redhat-marketplace-hfjk5\" (UID: \"4c696af7-ff23-4519-b874-e4c8f2277e88\") " pod="openshift-marketplace/redhat-marketplace-hfjk5"
Oct 01 14:19:06 crc kubenswrapper[4851]: I1001 14:19:06.591033 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srm25\" (UniqueName: \"kubernetes.io/projected/4c696af7-ff23-4519-b874-e4c8f2277e88-kube-api-access-srm25\") pod \"redhat-marketplace-hfjk5\" (UID: \"4c696af7-ff23-4519-b874-e4c8f2277e88\") " pod="openshift-marketplace/redhat-marketplace-hfjk5"
Oct 01 14:19:06 crc kubenswrapper[4851]: I1001 14:19:06.591312 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c696af7-ff23-4519-b874-e4c8f2277e88-catalog-content\") pod \"redhat-marketplace-hfjk5\" (UID: \"4c696af7-ff23-4519-b874-e4c8f2277e88\") " pod="openshift-marketplace/redhat-marketplace-hfjk5"
Oct 01 14:19:06 crc kubenswrapper[4851]: I1001 14:19:06.591593 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c696af7-ff23-4519-b874-e4c8f2277e88-utilities\") pod \"redhat-marketplace-hfjk5\" (UID: \"4c696af7-ff23-4519-b874-e4c8f2277e88\") " pod="openshift-marketplace/redhat-marketplace-hfjk5"
Oct 01 14:19:06 crc kubenswrapper[4851]: I1001 14:19:06.612280 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srm25\" (UniqueName: \"kubernetes.io/projected/4c696af7-ff23-4519-b874-e4c8f2277e88-kube-api-access-srm25\") pod \"redhat-marketplace-hfjk5\" (UID: \"4c696af7-ff23-4519-b874-e4c8f2277e88\") " pod="openshift-marketplace/redhat-marketplace-hfjk5"
Oct 01 14:19:06 crc kubenswrapper[4851]: I1001 14:19:06.706188 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hfjk5"
Oct 01 14:19:07 crc kubenswrapper[4851]: I1001 14:19:07.200302 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfjk5"]
Oct 01 14:19:07 crc kubenswrapper[4851]: W1001 14:19:07.201056 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c696af7_ff23_4519_b874_e4c8f2277e88.slice/crio-6a68229b28db775fa77c78c8f17edd28b9d565aae6a03987410e8d2ca1be764e WatchSource:0}: Error finding container 6a68229b28db775fa77c78c8f17edd28b9d565aae6a03987410e8d2ca1be764e: Status 404 returned error can't find the container with id 6a68229b28db775fa77c78c8f17edd28b9d565aae6a03987410e8d2ca1be764e
Oct 01 14:19:07 crc kubenswrapper[4851]: I1001 14:19:07.627350 4851 generic.go:334] "Generic (PLEG): container finished" podID="4c696af7-ff23-4519-b874-e4c8f2277e88" containerID="77be14ce5489cb70c215a626703fffb5e29690f8cbc3e78661e0aed4acec0fb3" exitCode=0
Oct 01 14:19:07 crc kubenswrapper[4851]: I1001 14:19:07.627424 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfjk5" event={"ID":"4c696af7-ff23-4519-b874-e4c8f2277e88","Type":"ContainerDied","Data":"77be14ce5489cb70c215a626703fffb5e29690f8cbc3e78661e0aed4acec0fb3"}
Oct 01 14:19:07 crc kubenswrapper[4851]: I1001 14:19:07.627467 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfjk5" event={"ID":"4c696af7-ff23-4519-b874-e4c8f2277e88","Type":"ContainerStarted","Data":"6a68229b28db775fa77c78c8f17edd28b9d565aae6a03987410e8d2ca1be764e"}
Oct 01 14:19:09 crc kubenswrapper[4851]: I1001 14:19:09.653831 4851 generic.go:334] "Generic (PLEG): container finished" podID="4c696af7-ff23-4519-b874-e4c8f2277e88" containerID="86e332c7aeee240556b358b54dd63413283d5ddf05980a3144f87cf0e9f680dd" exitCode=0
Oct 01 14:19:09 crc kubenswrapper[4851]: I1001 14:19:09.653914 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfjk5" event={"ID":"4c696af7-ff23-4519-b874-e4c8f2277e88","Type":"ContainerDied","Data":"86e332c7aeee240556b358b54dd63413283d5ddf05980a3144f87cf0e9f680dd"}
Oct 01 14:19:10 crc kubenswrapper[4851]: I1001 14:19:10.674598 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfjk5" event={"ID":"4c696af7-ff23-4519-b874-e4c8f2277e88","Type":"ContainerStarted","Data":"a8685c523d616712a5c805391556b6fcaf94f16fe9bf23af9e134c9c9bc7e7f7"}
Oct 01 14:19:10 crc kubenswrapper[4851]: I1001 14:19:10.735713 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hfjk5" podStartSLOduration=2.022369092 podStartE2EDuration="4.735682755s" podCreationTimestamp="2025-10-01 14:19:06 +0000 UTC" firstStartedPulling="2025-10-01 14:19:07.630304884 +0000 UTC m=+5155.975422380" lastFinishedPulling="2025-10-01 14:19:10.343618527 +0000 UTC m=+5158.688736043" observedRunningTime="2025-10-01 14:19:10.721977734 +0000 UTC m=+5159.067095230" watchObservedRunningTime="2025-10-01 14:19:10.735682755 +0000 UTC m=+5159.080800261"
Oct 01 14:19:15 crc kubenswrapper[4851]: I1001 14:19:15.329739 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed"
Oct 01 14:19:15 crc kubenswrapper[4851]: E1001 14:19:15.330530 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af"
Oct 01 14:19:16 crc kubenswrapper[4851]: I1001 14:19:16.707651 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hfjk5"
Oct 01 14:19:16 crc kubenswrapper[4851]: I1001 14:19:16.707971 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hfjk5"
Oct 01 14:19:16 crc kubenswrapper[4851]: I1001 14:19:16.786189 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hfjk5"
Oct 01 14:19:16 crc kubenswrapper[4851]: I1001 14:19:16.852159 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hfjk5"
Oct 01 14:19:17 crc kubenswrapper[4851]: I1001 14:19:17.035843 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfjk5"]
Oct 01 14:19:18 crc kubenswrapper[4851]: I1001 14:19:18.761899 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hfjk5" podUID="4c696af7-ff23-4519-b874-e4c8f2277e88" containerName="registry-server" containerID="cri-o://a8685c523d616712a5c805391556b6fcaf94f16fe9bf23af9e134c9c9bc7e7f7" gracePeriod=2
Oct 01 14:19:19 crc kubenswrapper[4851]: I1001 14:19:19.777355 4851 generic.go:334] "Generic (PLEG): container finished" podID="4c696af7-ff23-4519-b874-e4c8f2277e88" containerID="a8685c523d616712a5c805391556b6fcaf94f16fe9bf23af9e134c9c9bc7e7f7" exitCode=0
Oct 01 14:19:19 crc kubenswrapper[4851]: I1001 14:19:19.777687 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfjk5" event={"ID":"4c696af7-ff23-4519-b874-e4c8f2277e88","Type":"ContainerDied","Data":"a8685c523d616712a5c805391556b6fcaf94f16fe9bf23af9e134c9c9bc7e7f7"}
Oct 01 14:19:19 crc kubenswrapper[4851]: I1001 14:19:19.778268 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfjk5" event={"ID":"4c696af7-ff23-4519-b874-e4c8f2277e88","Type":"ContainerDied","Data":"6a68229b28db775fa77c78c8f17edd28b9d565aae6a03987410e8d2ca1be764e"}
Oct 01 14:19:19 crc kubenswrapper[4851]: I1001 14:19:19.778296 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a68229b28db775fa77c78c8f17edd28b9d565aae6a03987410e8d2ca1be764e"
Oct 01 14:19:19 crc kubenswrapper[4851]: I1001 14:19:19.822291 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hfjk5"
Oct 01 14:19:19 crc kubenswrapper[4851]: I1001 14:19:19.880203 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c696af7-ff23-4519-b874-e4c8f2277e88-catalog-content\") pod \"4c696af7-ff23-4519-b874-e4c8f2277e88\" (UID: \"4c696af7-ff23-4519-b874-e4c8f2277e88\") "
Oct 01 14:19:19 crc kubenswrapper[4851]: I1001 14:19:19.880291 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c696af7-ff23-4519-b874-e4c8f2277e88-utilities\") pod \"4c696af7-ff23-4519-b874-e4c8f2277e88\" (UID: \"4c696af7-ff23-4519-b874-e4c8f2277e88\") "
Oct 01 14:19:19 crc kubenswrapper[4851]: I1001 14:19:19.880526 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srm25\" (UniqueName: \"kubernetes.io/projected/4c696af7-ff23-4519-b874-e4c8f2277e88-kube-api-access-srm25\") pod \"4c696af7-ff23-4519-b874-e4c8f2277e88\" (UID: \"4c696af7-ff23-4519-b874-e4c8f2277e88\") "
Oct 01 14:19:19 crc kubenswrapper[4851]: I1001 14:19:19.882533 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c696af7-ff23-4519-b874-e4c8f2277e88-utilities" (OuterVolumeSpecName: "utilities") pod "4c696af7-ff23-4519-b874-e4c8f2277e88" (UID: "4c696af7-ff23-4519-b874-e4c8f2277e88"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 14:19:19 crc kubenswrapper[4851]: I1001 14:19:19.902636 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c696af7-ff23-4519-b874-e4c8f2277e88-kube-api-access-srm25" (OuterVolumeSpecName: "kube-api-access-srm25") pod "4c696af7-ff23-4519-b874-e4c8f2277e88" (UID: "4c696af7-ff23-4519-b874-e4c8f2277e88"). InnerVolumeSpecName "kube-api-access-srm25". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:19:19 crc kubenswrapper[4851]: I1001 14:19:19.929927 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c696af7-ff23-4519-b874-e4c8f2277e88-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c696af7-ff23-4519-b874-e4c8f2277e88" (UID: "4c696af7-ff23-4519-b874-e4c8f2277e88"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 14:19:19 crc kubenswrapper[4851]: I1001 14:19:19.987971 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srm25\" (UniqueName: \"kubernetes.io/projected/4c696af7-ff23-4519-b874-e4c8f2277e88-kube-api-access-srm25\") on node \"crc\" DevicePath \"\""
Oct 01 14:19:19 crc kubenswrapper[4851]: I1001 14:19:19.988006 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c696af7-ff23-4519-b874-e4c8f2277e88-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 14:19:19 crc kubenswrapper[4851]: I1001 14:19:19.988015 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c696af7-ff23-4519-b874-e4c8f2277e88-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 14:19:20 crc kubenswrapper[4851]: I1001 14:19:20.788682 4851 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hfjk5" Oct 01 14:19:20 crc kubenswrapper[4851]: I1001 14:19:20.829337 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfjk5"] Oct 01 14:19:20 crc kubenswrapper[4851]: I1001 14:19:20.839608 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfjk5"] Oct 01 14:19:22 crc kubenswrapper[4851]: I1001 14:19:22.338960 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c696af7-ff23-4519-b874-e4c8f2277e88" path="/var/lib/kubelet/pods/4c696af7-ff23-4519-b874-e4c8f2277e88/volumes" Oct 01 14:19:28 crc kubenswrapper[4851]: I1001 14:19:28.329225 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed" Oct 01 14:19:28 crc kubenswrapper[4851]: E1001 14:19:28.329986 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:19:42 crc kubenswrapper[4851]: I1001 14:19:42.336650 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed" Oct 01 14:19:42 crc kubenswrapper[4851]: E1001 14:19:42.337762 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:19:57 crc kubenswrapper[4851]: I1001 14:19:57.332732 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed" Oct 01 14:19:57 crc kubenswrapper[4851]: E1001 14:19:57.334228 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:20:12 crc kubenswrapper[4851]: I1001 14:20:12.336993 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed" Oct 01 14:20:12 crc kubenswrapper[4851]: E1001 14:20:12.338123 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:20:27 crc kubenswrapper[4851]: I1001 14:20:27.328909 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed" Oct 01 14:20:27 
crc kubenswrapper[4851]: E1001 14:20:27.329917 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:20:40 crc kubenswrapper[4851]: I1001 14:20:40.328406 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed" Oct 01 14:20:40 crc kubenswrapper[4851]: E1001 14:20:40.329205 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:20:54 crc kubenswrapper[4851]: I1001 14:20:54.329030 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed" Oct 01 14:20:54 crc kubenswrapper[4851]: E1001 14:20:54.329824 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:21:08 crc kubenswrapper[4851]: I1001 14:21:08.329199 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed" Oct 01 14:21:08 crc kubenswrapper[4851]: E1001 14:21:08.330107 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:21:20 crc kubenswrapper[4851]: I1001 14:21:20.329467 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed" Oct 01 14:21:20 crc kubenswrapper[4851]: E1001 14:21:20.330442 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:21:32 crc kubenswrapper[4851]: I1001 14:21:32.342469 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed" Oct 01 14:21:32 crc kubenswrapper[4851]: E1001 14:21:32.343539 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:21:45 crc kubenswrapper[4851]: I1001 14:21:45.328205 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed" Oct 01 14:21:45 crc kubenswrapper[4851]: E1001 14:21:45.330222 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:22:00 crc kubenswrapper[4851]: I1001 14:22:00.329673 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed" Oct 01 14:22:00 crc kubenswrapper[4851]: E1001 14:22:00.331012 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:22:13 crc kubenswrapper[4851]: I1001 14:22:13.328331 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed" Oct 01 14:22:13 crc kubenswrapper[4851]: E1001 14:22:13.329443 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:22:27 crc kubenswrapper[4851]: I1001 14:22:27.329148 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed" Oct 01 14:22:27 crc kubenswrapper[4851]: E1001 14:22:27.330119 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:22:40 crc kubenswrapper[4851]: I1001 14:22:40.328731 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed" Oct 01 14:22:41 crc kubenswrapper[4851]: I1001 14:22:41.174077 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"62dc185b893ee254e9bb04d2798c38e7519b81950796ff1e7b4054024891240d"} Oct 01 14:23:03 crc kubenswrapper[4851]: I1001 14:23:03.025290 
4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dctgm"] Oct 01 14:23:03 crc kubenswrapper[4851]: E1001 14:23:03.027612 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c696af7-ff23-4519-b874-e4c8f2277e88" containerName="registry-server" Oct 01 14:23:03 crc kubenswrapper[4851]: I1001 14:23:03.027639 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c696af7-ff23-4519-b874-e4c8f2277e88" containerName="registry-server" Oct 01 14:23:03 crc kubenswrapper[4851]: E1001 14:23:03.027669 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c696af7-ff23-4519-b874-e4c8f2277e88" containerName="extract-content" Oct 01 14:23:03 crc kubenswrapper[4851]: I1001 14:23:03.027681 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c696af7-ff23-4519-b874-e4c8f2277e88" containerName="extract-content" Oct 01 14:23:03 crc kubenswrapper[4851]: E1001 14:23:03.027723 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c696af7-ff23-4519-b874-e4c8f2277e88" containerName="extract-utilities" Oct 01 14:23:03 crc kubenswrapper[4851]: I1001 14:23:03.027740 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c696af7-ff23-4519-b874-e4c8f2277e88" containerName="extract-utilities" Oct 01 14:23:03 crc kubenswrapper[4851]: I1001 14:23:03.028097 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c696af7-ff23-4519-b874-e4c8f2277e88" containerName="registry-server" Oct 01 14:23:03 crc kubenswrapper[4851]: I1001 14:23:03.030552 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dctgm" Oct 01 14:23:03 crc kubenswrapper[4851]: I1001 14:23:03.045313 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dctgm"] Oct 01 14:23:03 crc kubenswrapper[4851]: I1001 14:23:03.174916 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abc04f0a-7081-42bc-aed9-77ba9b383d7a-utilities\") pod \"redhat-operators-dctgm\" (UID: \"abc04f0a-7081-42bc-aed9-77ba9b383d7a\") " pod="openshift-marketplace/redhat-operators-dctgm" Oct 01 14:23:03 crc kubenswrapper[4851]: I1001 14:23:03.174967 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abc04f0a-7081-42bc-aed9-77ba9b383d7a-catalog-content\") pod \"redhat-operators-dctgm\" (UID: \"abc04f0a-7081-42bc-aed9-77ba9b383d7a\") " pod="openshift-marketplace/redhat-operators-dctgm" Oct 01 14:23:03 crc kubenswrapper[4851]: I1001 14:23:03.175012 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkwpt\" (UniqueName: \"kubernetes.io/projected/abc04f0a-7081-42bc-aed9-77ba9b383d7a-kube-api-access-vkwpt\") pod \"redhat-operators-dctgm\" (UID: \"abc04f0a-7081-42bc-aed9-77ba9b383d7a\") " pod="openshift-marketplace/redhat-operators-dctgm" Oct 01 14:23:03 crc kubenswrapper[4851]: I1001 14:23:03.278844 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abc04f0a-7081-42bc-aed9-77ba9b383d7a-utilities\") pod \"redhat-operators-dctgm\" (UID: \"abc04f0a-7081-42bc-aed9-77ba9b383d7a\") " pod="openshift-marketplace/redhat-operators-dctgm" Oct 01 14:23:03 crc kubenswrapper[4851]: I1001 14:23:03.278891 4851 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abc04f0a-7081-42bc-aed9-77ba9b383d7a-catalog-content\") pod \"redhat-operators-dctgm\" (UID: \"abc04f0a-7081-42bc-aed9-77ba9b383d7a\") " pod="openshift-marketplace/redhat-operators-dctgm" Oct 01 14:23:03 crc kubenswrapper[4851]: I1001 14:23:03.278923 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkwpt\" (UniqueName: \"kubernetes.io/projected/abc04f0a-7081-42bc-aed9-77ba9b383d7a-kube-api-access-vkwpt\") pod \"redhat-operators-dctgm\" (UID: \"abc04f0a-7081-42bc-aed9-77ba9b383d7a\") " pod="openshift-marketplace/redhat-operators-dctgm" Oct 01 14:23:03 crc kubenswrapper[4851]: I1001 14:23:03.279672 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abc04f0a-7081-42bc-aed9-77ba9b383d7a-catalog-content\") pod \"redhat-operators-dctgm\" (UID: \"abc04f0a-7081-42bc-aed9-77ba9b383d7a\") " pod="openshift-marketplace/redhat-operators-dctgm" Oct 01 14:23:03 crc kubenswrapper[4851]: I1001 14:23:03.279863 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abc04f0a-7081-42bc-aed9-77ba9b383d7a-utilities\") pod \"redhat-operators-dctgm\" (UID: \"abc04f0a-7081-42bc-aed9-77ba9b383d7a\") " pod="openshift-marketplace/redhat-operators-dctgm" Oct 01 14:23:03 crc kubenswrapper[4851]: I1001 14:23:03.594726 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkwpt\" (UniqueName: \"kubernetes.io/projected/abc04f0a-7081-42bc-aed9-77ba9b383d7a-kube-api-access-vkwpt\") pod \"redhat-operators-dctgm\" (UID: \"abc04f0a-7081-42bc-aed9-77ba9b383d7a\") " pod="openshift-marketplace/redhat-operators-dctgm" Oct 01 14:23:03 crc kubenswrapper[4851]: I1001 14:23:03.669729 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dctgm" Oct 01 14:23:04 crc kubenswrapper[4851]: I1001 14:23:04.274952 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dctgm"] Oct 01 14:23:04 crc kubenswrapper[4851]: I1001 14:23:04.445743 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dctgm" event={"ID":"abc04f0a-7081-42bc-aed9-77ba9b383d7a","Type":"ContainerStarted","Data":"448c133d96fa0af1f410ee0835d8b00adffe12b8b69421a0ec86f178e3328aa5"} Oct 01 14:23:05 crc kubenswrapper[4851]: I1001 14:23:05.459231 4851 generic.go:334] "Generic (PLEG): container finished" podID="abc04f0a-7081-42bc-aed9-77ba9b383d7a" containerID="493198e0f96029af62bb0c12e0ab826156883e1828d73550e8c648b523e032b2" exitCode=0 Oct 01 14:23:05 crc kubenswrapper[4851]: I1001 14:23:05.459532 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dctgm" event={"ID":"abc04f0a-7081-42bc-aed9-77ba9b383d7a","Type":"ContainerDied","Data":"493198e0f96029af62bb0c12e0ab826156883e1828d73550e8c648b523e032b2"} Oct 01 14:23:07 crc kubenswrapper[4851]: I1001 14:23:07.483123 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dctgm" event={"ID":"abc04f0a-7081-42bc-aed9-77ba9b383d7a","Type":"ContainerStarted","Data":"c2475a01436f190b0ed52951a92a26d756f31b43e2357a99942c2798efea3cef"} Oct 01 14:23:08 crc kubenswrapper[4851]: I1001 14:23:08.497518 4851 generic.go:334] "Generic (PLEG): container finished" podID="abc04f0a-7081-42bc-aed9-77ba9b383d7a" containerID="c2475a01436f190b0ed52951a92a26d756f31b43e2357a99942c2798efea3cef" exitCode=0 Oct 01 14:23:08 crc kubenswrapper[4851]: I1001 14:23:08.497631 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dctgm" event={"ID":"abc04f0a-7081-42bc-aed9-77ba9b383d7a","Type":"ContainerDied","Data":"c2475a01436f190b0ed52951a92a26d756f31b43e2357a99942c2798efea3cef"} Oct 01 14:23:10 crc kubenswrapper[4851]: I1001 14:23:10.530163 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dctgm" event={"ID":"abc04f0a-7081-42bc-aed9-77ba9b383d7a","Type":"ContainerStarted","Data":"753721d6878d489aab117664c19fa4ed4cfe0215c6ed90152c5fe2b97e381054"} Oct 01 14:23:10 crc kubenswrapper[4851]: I1001 14:23:10.560364 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dctgm" podStartSLOduration=4.552665948 podStartE2EDuration="8.560338686s" podCreationTimestamp="2025-10-01 14:23:02 +0000 UTC" firstStartedPulling="2025-10-01 14:23:05.461772669 +0000 UTC m=+5393.806890165" lastFinishedPulling="2025-10-01 14:23:09.469445417 +0000 UTC m=+5397.814562903" observedRunningTime="2025-10-01 14:23:10.556485656 +0000 UTC m=+5398.901603172" watchObservedRunningTime="2025-10-01 14:23:10.560338686 +0000 UTC m=+5398.905456212" Oct 01 14:23:13 crc kubenswrapper[4851]: I1001 14:23:13.670758 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dctgm" Oct 01 14:23:13 crc kubenswrapper[4851]: I1001 14:23:13.671372 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dctgm" Oct 01 14:23:14 crc kubenswrapper[4851]: I1001 14:23:14.732241 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dctgm" 
podUID="abc04f0a-7081-42bc-aed9-77ba9b383d7a" containerName="registry-server" probeResult="failure" output=< Oct 01 14:23:14 crc kubenswrapper[4851]: timeout: failed to connect service ":50051" within 1s Oct 01 14:23:14 crc kubenswrapper[4851]: > Oct 01 14:23:23 crc kubenswrapper[4851]: I1001 14:23:23.731421 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dctgm" Oct 01 14:23:23 crc kubenswrapper[4851]: I1001 14:23:23.810685 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dctgm" Oct 01 14:23:23 crc kubenswrapper[4851]: I1001 14:23:23.982145 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dctgm"] Oct 01 14:23:25 crc kubenswrapper[4851]: I1001 14:23:25.721644 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dctgm" podUID="abc04f0a-7081-42bc-aed9-77ba9b383d7a" containerName="registry-server" containerID="cri-o://753721d6878d489aab117664c19fa4ed4cfe0215c6ed90152c5fe2b97e381054" gracePeriod=2 Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.280380 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dctgm" Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.440396 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abc04f0a-7081-42bc-aed9-77ba9b383d7a-catalog-content\") pod \"abc04f0a-7081-42bc-aed9-77ba9b383d7a\" (UID: \"abc04f0a-7081-42bc-aed9-77ba9b383d7a\") " Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.440930 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkwpt\" (UniqueName: \"kubernetes.io/projected/abc04f0a-7081-42bc-aed9-77ba9b383d7a-kube-api-access-vkwpt\") pod \"abc04f0a-7081-42bc-aed9-77ba9b383d7a\" (UID: \"abc04f0a-7081-42bc-aed9-77ba9b383d7a\") " Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.440991 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abc04f0a-7081-42bc-aed9-77ba9b383d7a-utilities\") pod \"abc04f0a-7081-42bc-aed9-77ba9b383d7a\" (UID: \"abc04f0a-7081-42bc-aed9-77ba9b383d7a\") " Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.441743 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abc04f0a-7081-42bc-aed9-77ba9b383d7a-utilities" (OuterVolumeSpecName: "utilities") pod "abc04f0a-7081-42bc-aed9-77ba9b383d7a" (UID: "abc04f0a-7081-42bc-aed9-77ba9b383d7a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.445810 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abc04f0a-7081-42bc-aed9-77ba9b383d7a-kube-api-access-vkwpt" (OuterVolumeSpecName: "kube-api-access-vkwpt") pod "abc04f0a-7081-42bc-aed9-77ba9b383d7a" (UID: "abc04f0a-7081-42bc-aed9-77ba9b383d7a"). InnerVolumeSpecName "kube-api-access-vkwpt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.542929 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkwpt\" (UniqueName: \"kubernetes.io/projected/abc04f0a-7081-42bc-aed9-77ba9b383d7a-kube-api-access-vkwpt\") on node \"crc\" DevicePath \"\"" Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.542960 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abc04f0a-7081-42bc-aed9-77ba9b383d7a-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.570739 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abc04f0a-7081-42bc-aed9-77ba9b383d7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "abc04f0a-7081-42bc-aed9-77ba9b383d7a" (UID: "abc04f0a-7081-42bc-aed9-77ba9b383d7a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.646076 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abc04f0a-7081-42bc-aed9-77ba9b383d7a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.740914 4851 generic.go:334] "Generic (PLEG): container finished" podID="abc04f0a-7081-42bc-aed9-77ba9b383d7a" containerID="753721d6878d489aab117664c19fa4ed4cfe0215c6ed90152c5fe2b97e381054" exitCode=0 Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.741012 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dctgm" event={"ID":"abc04f0a-7081-42bc-aed9-77ba9b383d7a","Type":"ContainerDied","Data":"753721d6878d489aab117664c19fa4ed4cfe0215c6ed90152c5fe2b97e381054"} Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.741097 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dctgm" event={"ID":"abc04f0a-7081-42bc-aed9-77ba9b383d7a","Type":"ContainerDied","Data":"448c133d96fa0af1f410ee0835d8b00adffe12b8b69421a0ec86f178e3328aa5"} Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.741138 4851 scope.go:117] "RemoveContainer" containerID="753721d6878d489aab117664c19fa4ed4cfe0215c6ed90152c5fe2b97e381054" Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.741036 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dctgm" Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.783723 4851 scope.go:117] "RemoveContainer" containerID="c2475a01436f190b0ed52951a92a26d756f31b43e2357a99942c2798efea3cef" Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.809672 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dctgm"] Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.821954 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dctgm"] Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.823435 4851 scope.go:117] "RemoveContainer" containerID="493198e0f96029af62bb0c12e0ab826156883e1828d73550e8c648b523e032b2" Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.931933 4851 scope.go:117] "RemoveContainer" containerID="753721d6878d489aab117664c19fa4ed4cfe0215c6ed90152c5fe2b97e381054" Oct 01 14:23:26 crc kubenswrapper[4851]: E1001 14:23:26.932333 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"753721d6878d489aab117664c19fa4ed4cfe0215c6ed90152c5fe2b97e381054\": container with ID starting with 753721d6878d489aab117664c19fa4ed4cfe0215c6ed90152c5fe2b97e381054 not found: ID does not exist" containerID="753721d6878d489aab117664c19fa4ed4cfe0215c6ed90152c5fe2b97e381054" Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.932362 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753721d6878d489aab117664c19fa4ed4cfe0215c6ed90152c5fe2b97e381054"} err="failed to get container status \"753721d6878d489aab117664c19fa4ed4cfe0215c6ed90152c5fe2b97e381054\": rpc error: code = NotFound desc = could not find container \"753721d6878d489aab117664c19fa4ed4cfe0215c6ed90152c5fe2b97e381054\": container with ID starting with 753721d6878d489aab117664c19fa4ed4cfe0215c6ed90152c5fe2b97e381054 not found: ID does not exist" Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.932382 4851 scope.go:117] "RemoveContainer" containerID="c2475a01436f190b0ed52951a92a26d756f31b43e2357a99942c2798efea3cef" Oct 01 14:23:26 crc kubenswrapper[4851]: E1001 14:23:26.932759 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2475a01436f190b0ed52951a92a26d756f31b43e2357a99942c2798efea3cef\": container with ID starting with c2475a01436f190b0ed52951a92a26d756f31b43e2357a99942c2798efea3cef not found: ID does not exist" containerID="c2475a01436f190b0ed52951a92a26d756f31b43e2357a99942c2798efea3cef" Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.932882 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2475a01436f190b0ed52951a92a26d756f31b43e2357a99942c2798efea3cef"} err="failed to get container status \"c2475a01436f190b0ed52951a92a26d756f31b43e2357a99942c2798efea3cef\": rpc error: code = NotFound desc = could not find container \"c2475a01436f190b0ed52951a92a26d756f31b43e2357a99942c2798efea3cef\": container with ID starting with c2475a01436f190b0ed52951a92a26d756f31b43e2357a99942c2798efea3cef not found: ID does not exist" Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.932964 4851 scope.go:117] "RemoveContainer" containerID="493198e0f96029af62bb0c12e0ab826156883e1828d73550e8c648b523e032b2" Oct 01 14:23:26 crc kubenswrapper[4851]: E1001 14:23:26.933390 4851 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"493198e0f96029af62bb0c12e0ab826156883e1828d73550e8c648b523e032b2\": container with ID starting with 493198e0f96029af62bb0c12e0ab826156883e1828d73550e8c648b523e032b2 not found: ID does not exist" containerID="493198e0f96029af62bb0c12e0ab826156883e1828d73550e8c648b523e032b2" Oct 01 14:23:26 crc kubenswrapper[4851]: I1001 14:23:26.933425 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"493198e0f96029af62bb0c12e0ab826156883e1828d73550e8c648b523e032b2"} err="failed to get container status \"493198e0f96029af62bb0c12e0ab826156883e1828d73550e8c648b523e032b2\": rpc error: code = NotFound desc = could not find container \"493198e0f96029af62bb0c12e0ab826156883e1828d73550e8c648b523e032b2\": container with ID starting with 493198e0f96029af62bb0c12e0ab826156883e1828d73550e8c648b523e032b2 not found: ID does not exist" Oct 01 14:23:28 crc kubenswrapper[4851]: I1001 14:23:28.341077 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abc04f0a-7081-42bc-aed9-77ba9b383d7a" path="/var/lib/kubelet/pods/abc04f0a-7081-42bc-aed9-77ba9b383d7a/volumes" Oct 01 14:23:49 crc kubenswrapper[4851]: I1001 14:23:49.564348 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j4dkp"] Oct 01 14:23:49 crc kubenswrapper[4851]: E1001 14:23:49.565989 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abc04f0a-7081-42bc-aed9-77ba9b383d7a" containerName="extract-utilities" Oct 01 14:23:49 crc kubenswrapper[4851]: I1001 14:23:49.566010 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="abc04f0a-7081-42bc-aed9-77ba9b383d7a" containerName="extract-utilities" Oct 01 14:23:49 crc kubenswrapper[4851]: E1001 14:23:49.566045 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abc04f0a-7081-42bc-aed9-77ba9b383d7a" containerName="registry-server" Oct 01 14:23:49 crc kubenswrapper[4851]: I1001 14:23:49.566055 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="abc04f0a-7081-42bc-aed9-77ba9b383d7a" containerName="registry-server" Oct 01 14:23:49 crc kubenswrapper[4851]: E1001 14:23:49.566083 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abc04f0a-7081-42bc-aed9-77ba9b383d7a" containerName="extract-content" Oct 01 14:23:49 crc kubenswrapper[4851]: I1001 14:23:49.566094 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="abc04f0a-7081-42bc-aed9-77ba9b383d7a" containerName="extract-content" Oct 01 14:23:49 crc kubenswrapper[4851]: I1001 14:23:49.566457 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="abc04f0a-7081-42bc-aed9-77ba9b383d7a" containerName="registry-server" Oct 01 14:23:49 crc kubenswrapper[4851]: I1001 14:23:49.568597 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j4dkp" Oct 01 14:23:49 crc kubenswrapper[4851]: I1001 14:23:49.594529 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j4dkp"] Oct 01 14:23:49 crc kubenswrapper[4851]: I1001 14:23:49.685755 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/030ccd3e-3374-4d21-93d5-751527d42b6c-catalog-content\") pod \"certified-operators-j4dkp\" (UID: \"030ccd3e-3374-4d21-93d5-751527d42b6c\") " pod="openshift-marketplace/certified-operators-j4dkp" Oct 01 14:23:49 crc kubenswrapper[4851]: I1001 14:23:49.686170 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/030ccd3e-3374-4d21-93d5-751527d42b6c-utilities\") pod \"certified-operators-j4dkp\" (UID: \"030ccd3e-3374-4d21-93d5-751527d42b6c\") " pod="openshift-marketplace/certified-operators-j4dkp" Oct 01 14:23:49 crc kubenswrapper[4851]: I1001 14:23:49.686257 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr2ts\" (UniqueName: \"kubernetes.io/projected/030ccd3e-3374-4d21-93d5-751527d42b6c-kube-api-access-nr2ts\") pod \"certified-operators-j4dkp\" (UID: \"030ccd3e-3374-4d21-93d5-751527d42b6c\") " pod="openshift-marketplace/certified-operators-j4dkp" Oct 01 14:23:49 crc kubenswrapper[4851]: I1001 14:23:49.787736 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/030ccd3e-3374-4d21-93d5-751527d42b6c-utilities\") pod \"certified-operators-j4dkp\" (UID: \"030ccd3e-3374-4d21-93d5-751527d42b6c\") " pod="openshift-marketplace/certified-operators-j4dkp" Oct 01 14:23:49 crc kubenswrapper[4851]: I1001 14:23:49.787801 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr2ts\" (UniqueName: \"kubernetes.io/projected/030ccd3e-3374-4d21-93d5-751527d42b6c-kube-api-access-nr2ts\") pod \"certified-operators-j4dkp\" (UID: \"030ccd3e-3374-4d21-93d5-751527d42b6c\") " pod="openshift-marketplace/certified-operators-j4dkp" Oct 01 14:23:49 crc kubenswrapper[4851]: I1001 14:23:49.787882 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/030ccd3e-3374-4d21-93d5-751527d42b6c-catalog-content\") pod \"certified-operators-j4dkp\" (UID: \"030ccd3e-3374-4d21-93d5-751527d42b6c\") " pod="openshift-marketplace/certified-operators-j4dkp" Oct 01 14:23:49 crc kubenswrapper[4851]: I1001 14:23:49.788574 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/030ccd3e-3374-4d21-93d5-751527d42b6c-catalog-content\") pod \"certified-operators-j4dkp\" (UID: \"030ccd3e-3374-4d21-93d5-751527d42b6c\") " pod="openshift-marketplace/certified-operators-j4dkp" Oct 01 14:23:49 crc kubenswrapper[4851]: I1001 14:23:49.788684 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/030ccd3e-3374-4d21-93d5-751527d42b6c-utilities\") pod \"certified-operators-j4dkp\" (UID: \"030ccd3e-3374-4d21-93d5-751527d42b6c\") " pod="openshift-marketplace/certified-operators-j4dkp" Oct 01 14:23:49 crc kubenswrapper[4851]: I1001 14:23:49.814937 4851 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nr2ts\" (UniqueName: \"kubernetes.io/projected/030ccd3e-3374-4d21-93d5-751527d42b6c-kube-api-access-nr2ts\") pod \"certified-operators-j4dkp\" (UID: \"030ccd3e-3374-4d21-93d5-751527d42b6c\") " pod="openshift-marketplace/certified-operators-j4dkp" Oct 01 14:23:49 crc kubenswrapper[4851]: I1001 14:23:49.912377 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j4dkp" Oct 01 14:23:50 crc kubenswrapper[4851]: I1001 14:23:50.498663 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j4dkp"] Oct 01 14:23:52 crc kubenswrapper[4851]: I1001 14:23:52.052807 4851 generic.go:334] "Generic (PLEG): container finished" podID="030ccd3e-3374-4d21-93d5-751527d42b6c" containerID="5f50771d6008a4bcf3395b5cd1da9e19746e500982beaa165ca3f798afb94f53" exitCode=0 Oct 01 14:23:52 crc kubenswrapper[4851]: I1001 14:23:52.053006 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j4dkp" event={"ID":"030ccd3e-3374-4d21-93d5-751527d42b6c","Type":"ContainerDied","Data":"5f50771d6008a4bcf3395b5cd1da9e19746e500982beaa165ca3f798afb94f53"} Oct 01 14:23:52 crc kubenswrapper[4851]: I1001 14:23:52.053108 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j4dkp" event={"ID":"030ccd3e-3374-4d21-93d5-751527d42b6c","Type":"ContainerStarted","Data":"4848e09338b4d37daccb9cd95fbdad4340aebccbe8e4877bfca2e55bf5d257ac"} Oct 01 14:23:52 crc kubenswrapper[4851]: I1001 14:23:52.055608 4851 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 14:23:54 crc kubenswrapper[4851]: I1001 14:23:54.074675 4851 generic.go:334] "Generic (PLEG): container finished" podID="030ccd3e-3374-4d21-93d5-751527d42b6c" containerID="fea94955a41cd34af013e6e62933a19eb73e31f465945b67b56ee8eab565e38c" exitCode=0 Oct 01 14:23:54 crc kubenswrapper[4851]: I1001 14:23:54.074756 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j4dkp" event={"ID":"030ccd3e-3374-4d21-93d5-751527d42b6c","Type":"ContainerDied","Data":"fea94955a41cd34af013e6e62933a19eb73e31f465945b67b56ee8eab565e38c"} Oct 01 14:23:55 crc kubenswrapper[4851]: I1001 14:23:55.091888 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j4dkp" event={"ID":"030ccd3e-3374-4d21-93d5-751527d42b6c","Type":"ContainerStarted","Data":"f9d316e010e25979c743a9bfc9c6e2bc1a8d0467ccd67cf7f427d561169d30f5"} Oct 01 14:23:55 crc kubenswrapper[4851]: I1001 14:23:55.120572 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j4dkp" podStartSLOduration=3.634342766 podStartE2EDuration="6.12055046s" podCreationTimestamp="2025-10-01 14:23:49 +0000 UTC" firstStartedPulling="2025-10-01 14:23:52.055257463 +0000 UTC m=+5440.400374969" lastFinishedPulling="2025-10-01 14:23:54.541465157 +0000 UTC m=+5442.886582663" observedRunningTime="2025-10-01 14:23:55.117860313 +0000 UTC m=+5443.462977819" watchObservedRunningTime="2025-10-01 14:23:55.12055046 +0000 UTC m=+5443.465667956" Oct 01 14:23:59 crc kubenswrapper[4851]: I1001 14:23:59.913535 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j4dkp" Oct 01 14:23:59 crc kubenswrapper[4851]: I1001 14:23:59.914171 4851 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j4dkp" Oct 01 14:23:59 crc kubenswrapper[4851]: I1001 14:23:59.973158 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j4dkp" Oct 01 14:24:00 crc kubenswrapper[4851]: I1001 14:24:00.193579 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j4dkp" Oct 01 14:24:00 crc kubenswrapper[4851]: I1001 14:24:00.240981 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j4dkp"] Oct 01 14:24:02 crc kubenswrapper[4851]: I1001 14:24:02.162255 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j4dkp" podUID="030ccd3e-3374-4d21-93d5-751527d42b6c" containerName="registry-server" containerID="cri-o://f9d316e010e25979c743a9bfc9c6e2bc1a8d0467ccd67cf7f427d561169d30f5" gracePeriod=2 Oct 01 14:24:02 crc kubenswrapper[4851]: I1001 14:24:02.988799 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j4dkp" Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.009639 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr2ts\" (UniqueName: \"kubernetes.io/projected/030ccd3e-3374-4d21-93d5-751527d42b6c-kube-api-access-nr2ts\") pod \"030ccd3e-3374-4d21-93d5-751527d42b6c\" (UID: \"030ccd3e-3374-4d21-93d5-751527d42b6c\") " Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.009756 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/030ccd3e-3374-4d21-93d5-751527d42b6c-utilities\") pod \"030ccd3e-3374-4d21-93d5-751527d42b6c\" (UID: \"030ccd3e-3374-4d21-93d5-751527d42b6c\") " Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.009814 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/030ccd3e-3374-4d21-93d5-751527d42b6c-catalog-content\") pod \"030ccd3e-3374-4d21-93d5-751527d42b6c\" (UID: \"030ccd3e-3374-4d21-93d5-751527d42b6c\") " Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.011229 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/030ccd3e-3374-4d21-93d5-751527d42b6c-utilities" (OuterVolumeSpecName: "utilities") pod "030ccd3e-3374-4d21-93d5-751527d42b6c" (UID: "030ccd3e-3374-4d21-93d5-751527d42b6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.019354 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/030ccd3e-3374-4d21-93d5-751527d42b6c-kube-api-access-nr2ts" (OuterVolumeSpecName: "kube-api-access-nr2ts") pod "030ccd3e-3374-4d21-93d5-751527d42b6c" (UID: "030ccd3e-3374-4d21-93d5-751527d42b6c"). InnerVolumeSpecName "kube-api-access-nr2ts". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.057833 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/030ccd3e-3374-4d21-93d5-751527d42b6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "030ccd3e-3374-4d21-93d5-751527d42b6c" (UID: "030ccd3e-3374-4d21-93d5-751527d42b6c"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.112155 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nr2ts\" (UniqueName: \"kubernetes.io/projected/030ccd3e-3374-4d21-93d5-751527d42b6c-kube-api-access-nr2ts\") on node \"crc\" DevicePath \"\"" Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.112479 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/030ccd3e-3374-4d21-93d5-751527d42b6c-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.112515 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/030ccd3e-3374-4d21-93d5-751527d42b6c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.176245 4851 generic.go:334] "Generic (PLEG): container finished" podID="030ccd3e-3374-4d21-93d5-751527d42b6c" containerID="f9d316e010e25979c743a9bfc9c6e2bc1a8d0467ccd67cf7f427d561169d30f5" exitCode=0 Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.176295 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j4dkp" event={"ID":"030ccd3e-3374-4d21-93d5-751527d42b6c","Type":"ContainerDied","Data":"f9d316e010e25979c743a9bfc9c6e2bc1a8d0467ccd67cf7f427d561169d30f5"} Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.176317 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j4dkp" Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.176338 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j4dkp" event={"ID":"030ccd3e-3374-4d21-93d5-751527d42b6c","Type":"ContainerDied","Data":"4848e09338b4d37daccb9cd95fbdad4340aebccbe8e4877bfca2e55bf5d257ac"} Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.176358 4851 scope.go:117] "RemoveContainer" containerID="f9d316e010e25979c743a9bfc9c6e2bc1a8d0467ccd67cf7f427d561169d30f5" Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.209262 4851 scope.go:117] "RemoveContainer" containerID="fea94955a41cd34af013e6e62933a19eb73e31f465945b67b56ee8eab565e38c" Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.216624 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j4dkp"] Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.224970 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j4dkp"] Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.232836 4851 scope.go:117] "RemoveContainer" containerID="5f50771d6008a4bcf3395b5cd1da9e19746e500982beaa165ca3f798afb94f53" Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.303178 4851 scope.go:117] "RemoveContainer" containerID="f9d316e010e25979c743a9bfc9c6e2bc1a8d0467ccd67cf7f427d561169d30f5" Oct 01 14:24:03 crc kubenswrapper[4851]: E1001 14:24:03.303676 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9d316e010e25979c743a9bfc9c6e2bc1a8d0467ccd67cf7f427d561169d30f5\": container with ID starting with f9d316e010e25979c743a9bfc9c6e2bc1a8d0467ccd67cf7f427d561169d30f5 not found: ID does not exist" containerID="f9d316e010e25979c743a9bfc9c6e2bc1a8d0467ccd67cf7f427d561169d30f5" Oct 01 14:24:03 crc 
kubenswrapper[4851]: I1001 14:24:03.303707 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d316e010e25979c743a9bfc9c6e2bc1a8d0467ccd67cf7f427d561169d30f5"} err="failed to get container status \"f9d316e010e25979c743a9bfc9c6e2bc1a8d0467ccd67cf7f427d561169d30f5\": rpc error: code = NotFound desc = could not find container \"f9d316e010e25979c743a9bfc9c6e2bc1a8d0467ccd67cf7f427d561169d30f5\": container with ID starting with f9d316e010e25979c743a9bfc9c6e2bc1a8d0467ccd67cf7f427d561169d30f5 not found: ID does not exist" Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.303725 4851 scope.go:117] "RemoveContainer" containerID="fea94955a41cd34af013e6e62933a19eb73e31f465945b67b56ee8eab565e38c" Oct 01 14:24:03 crc kubenswrapper[4851]: E1001 14:24:03.304200 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fea94955a41cd34af013e6e62933a19eb73e31f465945b67b56ee8eab565e38c\": container with ID starting with fea94955a41cd34af013e6e62933a19eb73e31f465945b67b56ee8eab565e38c not found: ID does not exist" containerID="fea94955a41cd34af013e6e62933a19eb73e31f465945b67b56ee8eab565e38c" Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.304286 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fea94955a41cd34af013e6e62933a19eb73e31f465945b67b56ee8eab565e38c"} err="failed to get container status \"fea94955a41cd34af013e6e62933a19eb73e31f465945b67b56ee8eab565e38c\": rpc error: code = NotFound desc = could not find container \"fea94955a41cd34af013e6e62933a19eb73e31f465945b67b56ee8eab565e38c\": container with ID starting with fea94955a41cd34af013e6e62933a19eb73e31f465945b67b56ee8eab565e38c not found: ID does not exist" Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.304320 4851 scope.go:117] "RemoveContainer" containerID="5f50771d6008a4bcf3395b5cd1da9e19746e500982beaa165ca3f798afb94f53" Oct 01 14:24:03 crc kubenswrapper[4851]: E1001 14:24:03.304597 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f50771d6008a4bcf3395b5cd1da9e19746e500982beaa165ca3f798afb94f53\": container with ID starting with 5f50771d6008a4bcf3395b5cd1da9e19746e500982beaa165ca3f798afb94f53 not found: ID does not exist" containerID="5f50771d6008a4bcf3395b5cd1da9e19746e500982beaa165ca3f798afb94f53" Oct 01 14:24:03 crc kubenswrapper[4851]: I1001 14:24:03.304617 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f50771d6008a4bcf3395b5cd1da9e19746e500982beaa165ca3f798afb94f53"} err="failed to get container status \"5f50771d6008a4bcf3395b5cd1da9e19746e500982beaa165ca3f798afb94f53\": rpc error: code = NotFound desc = could not find container \"5f50771d6008a4bcf3395b5cd1da9e19746e500982beaa165ca3f798afb94f53\": container with ID starting with 5f50771d6008a4bcf3395b5cd1da9e19746e500982beaa165ca3f798afb94f53 not found: ID does not exist" Oct 01 14:24:04 crc kubenswrapper[4851]: I1001 14:24:04.341390 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="030ccd3e-3374-4d21-93d5-751527d42b6c" path="/var/lib/kubelet/pods/030ccd3e-3374-4d21-93d5-751527d42b6c/volumes" Oct 01 14:25:00 crc kubenswrapper[4851]: I1001 14:25:00.050025 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:25:00 crc kubenswrapper[4851]: I1001 14:25:00.050825 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:25:20 crc kubenswrapper[4851]: I1001 14:25:20.307233 4851 scope.go:117] "RemoveContainer" containerID="a8685c523d616712a5c805391556b6fcaf94f16fe9bf23af9e134c9c9bc7e7f7" Oct 01 14:25:20 crc kubenswrapper[4851]: I1001 14:25:20.344607 4851 scope.go:117] "RemoveContainer" containerID="cbe77c052337c04226f0c2f0c7c348addd828538a06481b1a0dcfa79b503bfb4" Oct 01 14:25:20 crc kubenswrapper[4851]: I1001 14:25:20.379938 4851 scope.go:117] "RemoveContainer" containerID="d60c21a8cd9f1ab95a71874cb40bba5d8d078771c04524d7a69c679ab594b98a" Oct 01 14:25:20 crc kubenswrapper[4851]: I1001 14:25:20.420680 4851 scope.go:117] "RemoveContainer" containerID="86e332c7aeee240556b358b54dd63413283d5ddf05980a3144f87cf0e9f680dd" Oct 01 14:25:20 crc kubenswrapper[4851]: I1001 14:25:20.467575 4851 scope.go:117] "RemoveContainer" containerID="77be14ce5489cb70c215a626703fffb5e29690f8cbc3e78661e0aed4acec0fb3" Oct 01 14:25:20 crc kubenswrapper[4851]: I1001 14:25:20.515419 4851 scope.go:117] "RemoveContainer" containerID="5a4ab9ada0bb53412af52eff4126f7d2b3644db7d1503c208abe507a104c7153" Oct 01 14:25:30 crc kubenswrapper[4851]: I1001 14:25:30.050802 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:25:30 crc kubenswrapper[4851]: I1001 14:25:30.051396 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:26:00 crc kubenswrapper[4851]: I1001 14:26:00.050741 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:26:00 crc kubenswrapper[4851]: I1001 14:26:00.052961 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:26:00 crc kubenswrapper[4851]: I1001 14:26:00.053205 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 14:26:00 crc kubenswrapper[4851]: I1001 14:26:00.054624 4851 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"62dc185b893ee254e9bb04d2798c38e7519b81950796ff1e7b4054024891240d"} 
pod="openshift-machine-config-operator/machine-config-daemon-fv72m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 14:26:00 crc kubenswrapper[4851]: I1001 14:26:00.054894 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" containerID="cri-o://62dc185b893ee254e9bb04d2798c38e7519b81950796ff1e7b4054024891240d" gracePeriod=600 Oct 01 14:26:00 crc kubenswrapper[4851]: I1001 14:26:00.567809 4851 generic.go:334] "Generic (PLEG): container finished" podID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerID="62dc185b893ee254e9bb04d2798c38e7519b81950796ff1e7b4054024891240d" exitCode=0 Oct 01 14:26:00 crc kubenswrapper[4851]: I1001 14:26:00.567907 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerDied","Data":"62dc185b893ee254e9bb04d2798c38e7519b81950796ff1e7b4054024891240d"} Oct 01 14:26:00 crc kubenswrapper[4851]: I1001 14:26:00.567994 4851 scope.go:117] "RemoveContainer" containerID="f3451b443c784dba3b219c782e1786500d986a6442b7778497a39b093bf0b4ed" Oct 01 14:26:01 crc kubenswrapper[4851]: I1001 14:26:01.584564 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161"} Oct 01 14:28:30 crc kubenswrapper[4851]: I1001 14:28:30.050071 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:28:30 crc kubenswrapper[4851]: I1001 14:28:30.050863 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:29:00 crc kubenswrapper[4851]: I1001 14:29:00.050041 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:29:00 crc kubenswrapper[4851]: I1001 14:29:00.050786 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:29:30 crc kubenswrapper[4851]: I1001 14:29:30.050260 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:29:30 crc kubenswrapper[4851]: 
I1001 14:29:30.051070 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:29:30 crc kubenswrapper[4851]: I1001 14:29:30.051165 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 14:29:30 crc kubenswrapper[4851]: I1001 14:29:30.052486 4851 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161"} pod="openshift-machine-config-operator/machine-config-daemon-fv72m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 14:29:30 crc kubenswrapper[4851]: I1001 14:29:30.052809 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" containerID="cri-o://3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" gracePeriod=600 Oct 01 14:29:30 crc kubenswrapper[4851]: E1001 14:29:30.187574 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:29:30 crc kubenswrapper[4851]: I1001 14:29:30.897789 4851 generic.go:334] "Generic (PLEG): container finished" podID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" exitCode=0 Oct 01 14:29:30 crc kubenswrapper[4851]: I1001 14:29:30.897850 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerDied","Data":"3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161"} Oct 01 14:29:30 crc kubenswrapper[4851]: I1001 14:29:30.897924 4851 scope.go:117] "RemoveContainer" containerID="62dc185b893ee254e9bb04d2798c38e7519b81950796ff1e7b4054024891240d" Oct 01 14:29:30 crc kubenswrapper[4851]: I1001 14:29:30.898576 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:29:30 crc kubenswrapper[4851]: E1001 14:29:30.898927 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:29:34 crc kubenswrapper[4851]: I1001 14:29:34.431370 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-86mhl"] Oct 01 14:29:34 crc kubenswrapper[4851]: E1001 14:29:34.432278 
4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="030ccd3e-3374-4d21-93d5-751527d42b6c" containerName="extract-utilities" Oct 01 14:29:34 crc kubenswrapper[4851]: I1001 14:29:34.432292 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="030ccd3e-3374-4d21-93d5-751527d42b6c" containerName="extract-utilities" Oct 01 14:29:34 crc kubenswrapper[4851]: E1001 14:29:34.432300 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="030ccd3e-3374-4d21-93d5-751527d42b6c" containerName="extract-content" Oct 01 14:29:34 crc kubenswrapper[4851]: I1001 14:29:34.432306 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="030ccd3e-3374-4d21-93d5-751527d42b6c" containerName="extract-content" Oct 01 14:29:34 crc kubenswrapper[4851]: E1001 14:29:34.432327 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="030ccd3e-3374-4d21-93d5-751527d42b6c" containerName="registry-server" Oct 01 14:29:34 crc kubenswrapper[4851]: I1001 14:29:34.432333 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="030ccd3e-3374-4d21-93d5-751527d42b6c" containerName="registry-server" Oct 01 14:29:34 crc kubenswrapper[4851]: I1001 14:29:34.432551 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="030ccd3e-3374-4d21-93d5-751527d42b6c" containerName="registry-server" Oct 01 14:29:34 crc kubenswrapper[4851]: I1001 14:29:34.433939 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-86mhl" Oct 01 14:29:34 crc kubenswrapper[4851]: I1001 14:29:34.447087 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-86mhl"] Oct 01 14:29:34 crc kubenswrapper[4851]: I1001 14:29:34.582280 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/021b53ed-efda-4944-82d5-c49163bcf39c-utilities\") pod \"community-operators-86mhl\" (UID: \"021b53ed-efda-4944-82d5-c49163bcf39c\") " pod="openshift-marketplace/community-operators-86mhl" Oct 01 14:29:34 crc kubenswrapper[4851]: I1001 14:29:34.582595 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/021b53ed-efda-4944-82d5-c49163bcf39c-catalog-content\") pod \"community-operators-86mhl\" (UID: \"021b53ed-efda-4944-82d5-c49163bcf39c\") " pod="openshift-marketplace/community-operators-86mhl" Oct 01 14:29:34 crc kubenswrapper[4851]: I1001 14:29:34.582712 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxvzd\" (UniqueName: \"kubernetes.io/projected/021b53ed-efda-4944-82d5-c49163bcf39c-kube-api-access-rxvzd\") pod \"community-operators-86mhl\" (UID: \"021b53ed-efda-4944-82d5-c49163bcf39c\") " pod="openshift-marketplace/community-operators-86mhl" Oct 01 14:29:34 crc kubenswrapper[4851]: I1001 14:29:34.686126 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/021b53ed-efda-4944-82d5-c49163bcf39c-catalog-content\") pod \"community-operators-86mhl\" (UID: \"021b53ed-efda-4944-82d5-c49163bcf39c\") " pod="openshift-marketplace/community-operators-86mhl" Oct 01 14:29:34 crc kubenswrapper[4851]: I1001 14:29:34.686233 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxvzd\" (UniqueName: 
\"kubernetes.io/projected/021b53ed-efda-4944-82d5-c49163bcf39c-kube-api-access-rxvzd\") pod \"community-operators-86mhl\" (UID: \"021b53ed-efda-4944-82d5-c49163bcf39c\") " pod="openshift-marketplace/community-operators-86mhl" Oct 01 14:29:34 crc kubenswrapper[4851]: I1001 14:29:34.686374 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/021b53ed-efda-4944-82d5-c49163bcf39c-utilities\") pod \"community-operators-86mhl\" (UID: \"021b53ed-efda-4944-82d5-c49163bcf39c\") " pod="openshift-marketplace/community-operators-86mhl" Oct 01 14:29:34 crc kubenswrapper[4851]: I1001 14:29:34.687043 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/021b53ed-efda-4944-82d5-c49163bcf39c-catalog-content\") pod \"community-operators-86mhl\" (UID: \"021b53ed-efda-4944-82d5-c49163bcf39c\") " pod="openshift-marketplace/community-operators-86mhl" Oct 01 14:29:34 crc kubenswrapper[4851]: I1001 14:29:34.687312 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/021b53ed-efda-4944-82d5-c49163bcf39c-utilities\") pod \"community-operators-86mhl\" (UID: \"021b53ed-efda-4944-82d5-c49163bcf39c\") " pod="openshift-marketplace/community-operators-86mhl" Oct 01 14:29:34 crc kubenswrapper[4851]: I1001 14:29:34.711824 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxvzd\" (UniqueName: \"kubernetes.io/projected/021b53ed-efda-4944-82d5-c49163bcf39c-kube-api-access-rxvzd\") pod \"community-operators-86mhl\" (UID: \"021b53ed-efda-4944-82d5-c49163bcf39c\") " pod="openshift-marketplace/community-operators-86mhl" Oct 01 14:29:34 crc kubenswrapper[4851]: I1001 14:29:34.768105 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-86mhl" Oct 01 14:29:35 crc kubenswrapper[4851]: I1001 14:29:35.311397 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-86mhl"] Oct 01 14:29:35 crc kubenswrapper[4851]: I1001 14:29:35.971623 4851 generic.go:334] "Generic (PLEG): container finished" podID="021b53ed-efda-4944-82d5-c49163bcf39c" containerID="501c90412aa176f74a6ba9a47241332e2d6c9365889ee5f0b137ed9ff35f9fb9" exitCode=0 Oct 01 14:29:35 crc kubenswrapper[4851]: I1001 14:29:35.971679 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86mhl" event={"ID":"021b53ed-efda-4944-82d5-c49163bcf39c","Type":"ContainerDied","Data":"501c90412aa176f74a6ba9a47241332e2d6c9365889ee5f0b137ed9ff35f9fb9"} Oct 01 14:29:35 crc kubenswrapper[4851]: I1001 14:29:35.971968 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86mhl" event={"ID":"021b53ed-efda-4944-82d5-c49163bcf39c","Type":"ContainerStarted","Data":"9810ec78855cf58d05ed8747d190818cfd32394fe4e7cdb201d841088afcdaa5"} Oct 01 14:29:35 crc kubenswrapper[4851]: I1001 14:29:35.976381 4851 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 14:29:37 crc kubenswrapper[4851]: I1001 14:29:37.993169 4851 generic.go:334] "Generic (PLEG): container finished" podID="021b53ed-efda-4944-82d5-c49163bcf39c" containerID="4cd96d59a8f9777970789493d643593905d535b99c1710c6a60a7f9d03c84e75" exitCode=0 Oct 01 14:29:37 crc kubenswrapper[4851]: I1001 14:29:37.993248 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86mhl" event={"ID":"021b53ed-efda-4944-82d5-c49163bcf39c","Type":"ContainerDied","Data":"4cd96d59a8f9777970789493d643593905d535b99c1710c6a60a7f9d03c84e75"} Oct 01 14:29:40 crc kubenswrapper[4851]: I1001 14:29:40.039367 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86mhl" event={"ID":"021b53ed-efda-4944-82d5-c49163bcf39c","Type":"ContainerStarted","Data":"11d8bc883d5c47ad077422b1b44a68147209b6d6336dcaf07dab3b6cfd7ef30c"} Oct 01 14:29:40 crc kubenswrapper[4851]: I1001 14:29:40.077731 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-86mhl" podStartSLOduration=3.070142141 podStartE2EDuration="6.077716704s" podCreationTimestamp="2025-10-01 14:29:34 +0000 UTC" firstStartedPulling="2025-10-01 14:29:35.976147566 +0000 UTC m=+5784.321265052" lastFinishedPulling="2025-10-01 14:29:38.983722129 +0000 UTC m=+5787.328839615" observedRunningTime="2025-10-01 14:29:40.077101317 +0000 UTC m=+5788.422218833" watchObservedRunningTime="2025-10-01 14:29:40.077716704 +0000 UTC m=+5788.422834190" Oct 01 14:29:43 crc kubenswrapper[4851]: I1001 14:29:43.328890 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:29:43 crc kubenswrapper[4851]: E1001 14:29:43.329622 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:29:44 crc 
kubenswrapper[4851]: I1001 14:29:44.768202 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-86mhl" Oct 01 14:29:44 crc kubenswrapper[4851]: I1001 14:29:44.768554 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-86mhl" Oct 01 14:29:44 crc kubenswrapper[4851]: I1001 14:29:44.835316 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-86mhl" Oct 01 14:29:45 crc kubenswrapper[4851]: I1001 14:29:45.150942 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-86mhl" Oct 01 14:29:45 crc kubenswrapper[4851]: I1001 14:29:45.229859 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-86mhl"] Oct 01 14:29:47 crc kubenswrapper[4851]: I1001 14:29:47.105604 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-86mhl" podUID="021b53ed-efda-4944-82d5-c49163bcf39c" containerName="registry-server" containerID="cri-o://11d8bc883d5c47ad077422b1b44a68147209b6d6336dcaf07dab3b6cfd7ef30c" gracePeriod=2 Oct 01 14:29:48 crc kubenswrapper[4851]: I1001 14:29:48.119454 4851 generic.go:334] "Generic (PLEG): container finished" podID="021b53ed-efda-4944-82d5-c49163bcf39c" containerID="11d8bc883d5c47ad077422b1b44a68147209b6d6336dcaf07dab3b6cfd7ef30c" exitCode=0 Oct 01 14:29:48 crc kubenswrapper[4851]: I1001 14:29:48.119568 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86mhl" event={"ID":"021b53ed-efda-4944-82d5-c49163bcf39c","Type":"ContainerDied","Data":"11d8bc883d5c47ad077422b1b44a68147209b6d6336dcaf07dab3b6cfd7ef30c"} Oct 01 14:29:48 crc kubenswrapper[4851]: I1001 14:29:48.120119 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86mhl" event={"ID":"021b53ed-efda-4944-82d5-c49163bcf39c","Type":"ContainerDied","Data":"9810ec78855cf58d05ed8747d190818cfd32394fe4e7cdb201d841088afcdaa5"} Oct 01 14:29:48 crc kubenswrapper[4851]: I1001 14:29:48.120145 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9810ec78855cf58d05ed8747d190818cfd32394fe4e7cdb201d841088afcdaa5" Oct 01 14:29:48 crc kubenswrapper[4851]: I1001 14:29:48.158637 4851 util.go:48] "No ready sandbox for pod can be found. 
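
The probe activity threaded through these entries — the kubelet's liveness GETs against machine-config-daemon on 127.0.0.1:8798/health, and the startup/readiness transitions for the catalog pod just above — is driven by the containers' probe specs. Below is a minimal Go sketch of a liveness probe that would produce exactly those GETs, built from the k8s.io/api types; the 30-second period and failure threshold of 3 are inferred from the spacing of the logged failures, not read from the actual DaemonSet, so treat them as assumptions.

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/util/intstr"
    )

    func main() {
        // A probe shaped like the failures in the log:
        //   Get "http://127.0.0.1:8798/health": dial tcp 127.0.0.1:8798: connect: connection refused
        // PeriodSeconds and FailureThreshold are assumptions inferred from the
        // 30s spacing of the probe entries and the restart after the third failure.
        liveness := &corev1.Probe{
            ProbeHandler: corev1.ProbeHandler{
                HTTPGet: &corev1.HTTPGetAction{
                    Host: "127.0.0.1",
                    Path: "/health",
                    Port: intstr.FromInt(8798),
                },
            },
            PeriodSeconds:    30,
            FailureThreshold: 3,
        }
        fmt.Printf("GET http://%s:%d%s every %ds, restart after %d consecutive failures\n",
            liveness.HTTPGet.Host, liveness.HTTPGet.Port.IntValue(),
            liveness.HTTPGet.Path, liveness.PeriodSeconds, liveness.FailureThreshold)
    }

Under those assumed values, three consecutive refusals take about 90 seconds before the kill, which lines up with the failures at 14:25:00/14:25:30/14:26:00 followed by the "failed liveness probe, will be restarted" message at 14:26:00.
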
Need to start a new one" pod="openshift-marketplace/community-operators-86mhl" Oct 01 14:29:48 crc kubenswrapper[4851]: I1001 14:29:48.289302 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/021b53ed-efda-4944-82d5-c49163bcf39c-utilities\") pod \"021b53ed-efda-4944-82d5-c49163bcf39c\" (UID: \"021b53ed-efda-4944-82d5-c49163bcf39c\") " Oct 01 14:29:48 crc kubenswrapper[4851]: I1001 14:29:48.289579 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/021b53ed-efda-4944-82d5-c49163bcf39c-catalog-content\") pod \"021b53ed-efda-4944-82d5-c49163bcf39c\" (UID: \"021b53ed-efda-4944-82d5-c49163bcf39c\") " Oct 01 14:29:48 crc kubenswrapper[4851]: I1001 14:29:48.289635 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxvzd\" (UniqueName: \"kubernetes.io/projected/021b53ed-efda-4944-82d5-c49163bcf39c-kube-api-access-rxvzd\") pod \"021b53ed-efda-4944-82d5-c49163bcf39c\" (UID: \"021b53ed-efda-4944-82d5-c49163bcf39c\") " Oct 01 14:29:48 crc kubenswrapper[4851]: I1001 14:29:48.290114 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/021b53ed-efda-4944-82d5-c49163bcf39c-utilities" (OuterVolumeSpecName: "utilities") pod "021b53ed-efda-4944-82d5-c49163bcf39c" (UID: "021b53ed-efda-4944-82d5-c49163bcf39c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:29:48 crc kubenswrapper[4851]: I1001 14:29:48.290627 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/021b53ed-efda-4944-82d5-c49163bcf39c-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:29:48 crc kubenswrapper[4851]: I1001 14:29:48.301981 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/021b53ed-efda-4944-82d5-c49163bcf39c-kube-api-access-rxvzd" (OuterVolumeSpecName: "kube-api-access-rxvzd") pod "021b53ed-efda-4944-82d5-c49163bcf39c" (UID: "021b53ed-efda-4944-82d5-c49163bcf39c"). InnerVolumeSpecName "kube-api-access-rxvzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:29:48 crc kubenswrapper[4851]: I1001 14:29:48.364379 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/021b53ed-efda-4944-82d5-c49163bcf39c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "021b53ed-efda-4944-82d5-c49163bcf39c" (UID: "021b53ed-efda-4944-82d5-c49163bcf39c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:29:48 crc kubenswrapper[4851]: I1001 14:29:48.392653 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxvzd\" (UniqueName: \"kubernetes.io/projected/021b53ed-efda-4944-82d5-c49163bcf39c-kube-api-access-rxvzd\") on node \"crc\" DevicePath \"\"" Oct 01 14:29:48 crc kubenswrapper[4851]: I1001 14:29:48.392695 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/021b53ed-efda-4944-82d5-c49163bcf39c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:29:49 crc kubenswrapper[4851]: I1001 14:29:49.135597 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-86mhl" Oct 01 14:29:49 crc kubenswrapper[4851]: I1001 14:29:49.183951 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-86mhl"] Oct 01 14:29:49 crc kubenswrapper[4851]: I1001 14:29:49.219466 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-86mhl"] Oct 01 14:29:50 crc kubenswrapper[4851]: I1001 14:29:50.346311 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="021b53ed-efda-4944-82d5-c49163bcf39c" path="/var/lib/kubelet/pods/021b53ed-efda-4944-82d5-c49163bcf39c/volumes" Oct 01 14:29:54 crc kubenswrapper[4851]: I1001 14:29:54.331895 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:29:54 crc kubenswrapper[4851]: E1001 14:29:54.333145 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:30:00 crc kubenswrapper[4851]: I1001 14:30:00.164899 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322150-w4jvp"] Oct 01 14:30:00 crc kubenswrapper[4851]: E1001 14:30:00.166087 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="021b53ed-efda-4944-82d5-c49163bcf39c" containerName="registry-server" Oct 01 14:30:00 crc kubenswrapper[4851]: I1001 14:30:00.166109 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="021b53ed-efda-4944-82d5-c49163bcf39c" containerName="registry-server" Oct 01 14:30:00 crc kubenswrapper[4851]: E1001 14:30:00.166154 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="021b53ed-efda-4944-82d5-c49163bcf39c" containerName="extract-content" Oct 01 14:30:00 crc kubenswrapper[4851]: I1001 14:30:00.166168 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="021b53ed-efda-4944-82d5-c49163bcf39c" containerName="extract-content" Oct 01 14:30:00 crc kubenswrapper[4851]: E1001 14:30:00.166244 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="021b53ed-efda-4944-82d5-c49163bcf39c" containerName="extract-utilities" Oct 01 14:30:00 crc kubenswrapper[4851]: I1001 14:30:00.166317 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="021b53ed-efda-4944-82d5-c49163bcf39c" containerName="extract-utilities" Oct 01 14:30:00 crc kubenswrapper[4851]: I1001 14:30:00.166755 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="021b53ed-efda-4944-82d5-c49163bcf39c" containerName="registry-server" Oct 01 14:30:00 crc kubenswrapper[4851]: I1001 14:30:00.168054 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-w4jvp" Oct 01 14:30:00 crc kubenswrapper[4851]: I1001 14:30:00.172248 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 14:30:00 crc kubenswrapper[4851]: I1001 14:30:00.172536 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 14:30:00 crc kubenswrapper[4851]: I1001 14:30:00.178484 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322150-w4jvp"] Oct 01 14:30:00 crc kubenswrapper[4851]: I1001 14:30:00.286944 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gz4f\" (UniqueName: \"kubernetes.io/projected/c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f-kube-api-access-7gz4f\") pod \"collect-profiles-29322150-w4jvp\" (UID: \"c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-w4jvp" Oct 01 14:30:00 crc kubenswrapper[4851]: I1001 14:30:00.287384 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f-config-volume\") pod \"collect-profiles-29322150-w4jvp\" (UID: \"c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-w4jvp" Oct 01 14:30:00 crc kubenswrapper[4851]: I1001 14:30:00.287452 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f-secret-volume\") pod \"collect-profiles-29322150-w4jvp\" (UID: \"c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-w4jvp" Oct 01 14:30:00 crc kubenswrapper[4851]: I1001 14:30:00.388902 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f-config-volume\") pod \"collect-profiles-29322150-w4jvp\" (UID: \"c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-w4jvp" Oct 01 14:30:00 crc kubenswrapper[4851]: I1001 14:30:00.389003 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f-secret-volume\") pod \"collect-profiles-29322150-w4jvp\" (UID: \"c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-w4jvp" Oct 01 14:30:00 crc kubenswrapper[4851]: I1001 14:30:00.389093 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gz4f\" (UniqueName: \"kubernetes.io/projected/c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f-kube-api-access-7gz4f\") pod \"collect-profiles-29322150-w4jvp\" (UID: \"c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-w4jvp" Oct 01 14:30:00 crc kubenswrapper[4851]: I1001 14:30:00.390102 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f-config-volume\") pod 
\"collect-profiles-29322150-w4jvp\" (UID: \"c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-w4jvp" Oct 01 14:30:00 crc kubenswrapper[4851]: I1001 14:30:00.407119 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f-secret-volume\") pod \"collect-profiles-29322150-w4jvp\" (UID: \"c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-w4jvp" Oct 01 14:30:00 crc kubenswrapper[4851]: I1001 14:30:00.414745 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gz4f\" (UniqueName: \"kubernetes.io/projected/c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f-kube-api-access-7gz4f\") pod \"collect-profiles-29322150-w4jvp\" (UID: \"c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-w4jvp" Oct 01 14:30:00 crc kubenswrapper[4851]: I1001 14:30:00.500623 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-w4jvp" Oct 01 14:30:01 crc kubenswrapper[4851]: I1001 14:30:01.017694 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322150-w4jvp"] Oct 01 14:30:01 crc kubenswrapper[4851]: I1001 14:30:01.308565 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-w4jvp" event={"ID":"c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f","Type":"ContainerStarted","Data":"e18c3829e0ea09825095511e21123235ae0e7b86989c0c3f7c7deb58631dedfb"} Oct 01 14:30:02 crc kubenswrapper[4851]: I1001 14:30:02.324362 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-w4jvp" event={"ID":"c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f","Type":"ContainerStarted","Data":"c2ff60dab06bfc93d3bfcbe2abcabdc5206deff591e00cb8628c07a7d31e1c19"} Oct 01 14:30:02 crc kubenswrapper[4851]: I1001 14:30:02.347142 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-w4jvp" podStartSLOduration=2.347117423 podStartE2EDuration="2.347117423s" podCreationTimestamp="2025-10-01 14:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:30:02.345371853 +0000 UTC m=+5810.690489389" watchObservedRunningTime="2025-10-01 14:30:02.347117423 +0000 UTC m=+5810.692234949" Oct 01 14:30:03 crc kubenswrapper[4851]: I1001 14:30:03.339523 4851 generic.go:334] "Generic (PLEG): container finished" podID="c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f" containerID="c2ff60dab06bfc93d3bfcbe2abcabdc5206deff591e00cb8628c07a7d31e1c19" exitCode=0 Oct 01 14:30:03 crc kubenswrapper[4851]: I1001 14:30:03.339595 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-w4jvp" event={"ID":"c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f","Type":"ContainerDied","Data":"c2ff60dab06bfc93d3bfcbe2abcabdc5206deff591e00cb8628c07a7d31e1c19"} Oct 01 14:30:04 crc kubenswrapper[4851]: I1001 14:30:04.709904 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-w4jvp" Oct 01 14:30:04 crc kubenswrapper[4851]: I1001 14:30:04.798156 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f-config-volume\") pod \"c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f\" (UID: \"c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f\") " Oct 01 14:30:04 crc kubenswrapper[4851]: I1001 14:30:04.798367 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gz4f\" (UniqueName: \"kubernetes.io/projected/c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f-kube-api-access-7gz4f\") pod \"c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f\" (UID: \"c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f\") " Oct 01 14:30:04 crc kubenswrapper[4851]: I1001 14:30:04.798537 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f-secret-volume\") pod \"c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f\" (UID: \"c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f\") " Oct 01 14:30:04 crc kubenswrapper[4851]: I1001 14:30:04.799049 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f-config-volume" (OuterVolumeSpecName: "config-volume") pod "c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f" (UID: "c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:30:04 crc kubenswrapper[4851]: I1001 14:30:04.799259 4851 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:30:04 crc kubenswrapper[4851]: I1001 14:30:04.806640 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f-kube-api-access-7gz4f" (OuterVolumeSpecName: "kube-api-access-7gz4f") pod "c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f" (UID: "c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f"). InnerVolumeSpecName "kube-api-access-7gz4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:30:04 crc kubenswrapper[4851]: I1001 14:30:04.812585 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f" (UID: "c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:30:04 crc kubenswrapper[4851]: I1001 14:30:04.900668 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gz4f\" (UniqueName: \"kubernetes.io/projected/c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f-kube-api-access-7gz4f\") on node \"crc\" DevicePath \"\"" Oct 01 14:30:04 crc kubenswrapper[4851]: I1001 14:30:04.900701 4851 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:30:05 crc kubenswrapper[4851]: I1001 14:30:05.364788 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-w4jvp" event={"ID":"c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f","Type":"ContainerDied","Data":"e18c3829e0ea09825095511e21123235ae0e7b86989c0c3f7c7deb58631dedfb"} Oct 01 14:30:05 crc kubenswrapper[4851]: I1001 14:30:05.364847 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e18c3829e0ea09825095511e21123235ae0e7b86989c0c3f7c7deb58631dedfb" Oct 01 14:30:05 crc kubenswrapper[4851]: I1001 14:30:05.364858 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-w4jvp" Oct 01 14:30:05 crc kubenswrapper[4851]: I1001 14:30:05.445484 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322105-2gdk2"] Oct 01 14:30:05 crc kubenswrapper[4851]: I1001 14:30:05.457751 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322105-2gdk2"] Oct 01 14:30:06 crc kubenswrapper[4851]: I1001 14:30:06.329434 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:30:06 crc kubenswrapper[4851]: E1001 14:30:06.329889 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:30:06 crc kubenswrapper[4851]: I1001 14:30:06.350041 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d95900f-86f8-43b8-b67c-cc68b42d1713" path="/var/lib/kubelet/pods/5d95900f-86f8-43b8-b67c-cc68b42d1713/volumes" Oct 01 14:30:20 crc kubenswrapper[4851]: I1001 14:30:20.329148 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:30:20 crc kubenswrapper[4851]: E1001 14:30:20.330214 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:30:20 crc kubenswrapper[4851]: I1001 14:30:20.712094 4851 scope.go:117] "RemoveContainer" containerID="1d193f9e6421e7a5a852991402ebf6f4d94cc4bd2135630cccd33ecb428b9169" Oct 01 14:30:30 
crc kubenswrapper[4851]: I1001 14:30:30.802211 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tqcxc"] Oct 01 14:30:30 crc kubenswrapper[4851]: E1001 14:30:30.803954 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f" containerName="collect-profiles" Oct 01 14:30:30 crc kubenswrapper[4851]: I1001 14:30:30.803974 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f" containerName="collect-profiles" Oct 01 14:30:30 crc kubenswrapper[4851]: I1001 14:30:30.804248 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="c96b5d2f-3b43-4230-9b2c-0fcfe44b5f1f" containerName="collect-profiles" Oct 01 14:30:30 crc kubenswrapper[4851]: I1001 14:30:30.806168 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tqcxc" Oct 01 14:30:30 crc kubenswrapper[4851]: I1001 14:30:30.826399 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tqcxc"] Oct 01 14:30:30 crc kubenswrapper[4851]: I1001 14:30:30.906949 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1193a35b-978e-4d47-9922-c5ad68bcc819-utilities\") pod \"redhat-marketplace-tqcxc\" (UID: \"1193a35b-978e-4d47-9922-c5ad68bcc819\") " pod="openshift-marketplace/redhat-marketplace-tqcxc" Oct 01 14:30:30 crc kubenswrapper[4851]: I1001 14:30:30.907068 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1193a35b-978e-4d47-9922-c5ad68bcc819-catalog-content\") pod \"redhat-marketplace-tqcxc\" (UID: \"1193a35b-978e-4d47-9922-c5ad68bcc819\") " pod="openshift-marketplace/redhat-marketplace-tqcxc" Oct 01 14:30:30 crc kubenswrapper[4851]: I1001 14:30:30.907194 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w22gw\" (UniqueName: \"kubernetes.io/projected/1193a35b-978e-4d47-9922-c5ad68bcc819-kube-api-access-w22gw\") pod \"redhat-marketplace-tqcxc\" (UID: \"1193a35b-978e-4d47-9922-c5ad68bcc819\") " pod="openshift-marketplace/redhat-marketplace-tqcxc" Oct 01 14:30:31 crc kubenswrapper[4851]: I1001 14:30:31.008627 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1193a35b-978e-4d47-9922-c5ad68bcc819-utilities\") pod \"redhat-marketplace-tqcxc\" (UID: \"1193a35b-978e-4d47-9922-c5ad68bcc819\") " pod="openshift-marketplace/redhat-marketplace-tqcxc" Oct 01 14:30:31 crc kubenswrapper[4851]: I1001 14:30:31.008754 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1193a35b-978e-4d47-9922-c5ad68bcc819-catalog-content\") pod \"redhat-marketplace-tqcxc\" (UID: \"1193a35b-978e-4d47-9922-c5ad68bcc819\") " pod="openshift-marketplace/redhat-marketplace-tqcxc" Oct 01 14:30:31 crc kubenswrapper[4851]: I1001 14:30:31.008885 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w22gw\" (UniqueName: \"kubernetes.io/projected/1193a35b-978e-4d47-9922-c5ad68bcc819-kube-api-access-w22gw\") pod \"redhat-marketplace-tqcxc\" (UID: \"1193a35b-978e-4d47-9922-c5ad68bcc819\") " pod="openshift-marketplace/redhat-marketplace-tqcxc" Oct 01 
14:30:31 crc kubenswrapper[4851]: I1001 14:30:31.009197 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1193a35b-978e-4d47-9922-c5ad68bcc819-utilities\") pod \"redhat-marketplace-tqcxc\" (UID: \"1193a35b-978e-4d47-9922-c5ad68bcc819\") " pod="openshift-marketplace/redhat-marketplace-tqcxc" Oct 01 14:30:31 crc kubenswrapper[4851]: I1001 14:30:31.009271 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1193a35b-978e-4d47-9922-c5ad68bcc819-catalog-content\") pod \"redhat-marketplace-tqcxc\" (UID: \"1193a35b-978e-4d47-9922-c5ad68bcc819\") " pod="openshift-marketplace/redhat-marketplace-tqcxc" Oct 01 14:30:31 crc kubenswrapper[4851]: I1001 14:30:31.031669 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w22gw\" (UniqueName: \"kubernetes.io/projected/1193a35b-978e-4d47-9922-c5ad68bcc819-kube-api-access-w22gw\") pod \"redhat-marketplace-tqcxc\" (UID: \"1193a35b-978e-4d47-9922-c5ad68bcc819\") " pod="openshift-marketplace/redhat-marketplace-tqcxc" Oct 01 14:30:31 crc kubenswrapper[4851]: I1001 14:30:31.140865 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tqcxc" Oct 01 14:30:31 crc kubenswrapper[4851]: I1001 14:30:31.594005 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tqcxc"] Oct 01 14:30:31 crc kubenswrapper[4851]: I1001 14:30:31.689152 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqcxc" event={"ID":"1193a35b-978e-4d47-9922-c5ad68bcc819","Type":"ContainerStarted","Data":"718b3c379bdf69877a6d760d261fded9b4e4b598b0aa534c2ba3e482f5c1aedd"} Oct 01 14:30:32 crc kubenswrapper[4851]: I1001 14:30:32.701823 4851 generic.go:334] "Generic (PLEG): container finished" podID="1193a35b-978e-4d47-9922-c5ad68bcc819" containerID="2c24992a411b54f1780f7e8c97b682d8bba089d5789ff56a6b058ad9f6ce7513" exitCode=0 Oct 01 14:30:32 crc kubenswrapper[4851]: I1001 14:30:32.701935 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqcxc" event={"ID":"1193a35b-978e-4d47-9922-c5ad68bcc819","Type":"ContainerDied","Data":"2c24992a411b54f1780f7e8c97b682d8bba089d5789ff56a6b058ad9f6ce7513"} Oct 01 14:30:34 crc kubenswrapper[4851]: I1001 14:30:34.328330 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:30:34 crc kubenswrapper[4851]: E1001 14:30:34.328633 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:30:35 crc kubenswrapper[4851]: I1001 14:30:35.737484 4851 generic.go:334] "Generic (PLEG): container finished" podID="1193a35b-978e-4d47-9922-c5ad68bcc819" containerID="672a6df3b4c0780c3567ee01ad0bccfac2bb9c56af5b4ef0bb7655b43aa26b70" exitCode=0 Oct 01 14:30:35 crc kubenswrapper[4851]: I1001 14:30:35.737574 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqcxc" 
event={"ID":"1193a35b-978e-4d47-9922-c5ad68bcc819","Type":"ContainerDied","Data":"672a6df3b4c0780c3567ee01ad0bccfac2bb9c56af5b4ef0bb7655b43aa26b70"} Oct 01 14:30:37 crc kubenswrapper[4851]: I1001 14:30:37.759335 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqcxc" event={"ID":"1193a35b-978e-4d47-9922-c5ad68bcc819","Type":"ContainerStarted","Data":"9d91e2decd3260ae9987d27dc8c03c275213e8e575a7657dad6d182726cda63d"} Oct 01 14:30:37 crc kubenswrapper[4851]: I1001 14:30:37.782297 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tqcxc" podStartSLOduration=3.601824212 podStartE2EDuration="7.782272181s" podCreationTimestamp="2025-10-01 14:30:30 +0000 UTC" firstStartedPulling="2025-10-01 14:30:32.704531391 +0000 UTC m=+5841.049648877" lastFinishedPulling="2025-10-01 14:30:36.88497932 +0000 UTC m=+5845.230096846" observedRunningTime="2025-10-01 14:30:37.780611374 +0000 UTC m=+5846.125728860" watchObservedRunningTime="2025-10-01 14:30:37.782272181 +0000 UTC m=+5846.127389697" Oct 01 14:30:41 crc kubenswrapper[4851]: I1001 14:30:41.141961 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tqcxc" Oct 01 14:30:41 crc kubenswrapper[4851]: I1001 14:30:41.144469 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tqcxc" Oct 01 14:30:41 crc kubenswrapper[4851]: I1001 14:30:41.229278 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tqcxc" Oct 01 14:30:42 crc kubenswrapper[4851]: I1001 14:30:42.885074 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tqcxc" Oct 01 14:30:42 crc kubenswrapper[4851]: I1001 14:30:42.954785 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tqcxc"] Oct 01 14:30:44 crc kubenswrapper[4851]: I1001 14:30:44.829426 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tqcxc" podUID="1193a35b-978e-4d47-9922-c5ad68bcc819" containerName="registry-server" containerID="cri-o://9d91e2decd3260ae9987d27dc8c03c275213e8e575a7657dad6d182726cda63d" gracePeriod=2 Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.294055 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tqcxc" Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.319951 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1193a35b-978e-4d47-9922-c5ad68bcc819-catalog-content\") pod \"1193a35b-978e-4d47-9922-c5ad68bcc819\" (UID: \"1193a35b-978e-4d47-9922-c5ad68bcc819\") " Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.320109 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1193a35b-978e-4d47-9922-c5ad68bcc819-utilities\") pod \"1193a35b-978e-4d47-9922-c5ad68bcc819\" (UID: \"1193a35b-978e-4d47-9922-c5ad68bcc819\") " Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.320370 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w22gw\" (UniqueName: \"kubernetes.io/projected/1193a35b-978e-4d47-9922-c5ad68bcc819-kube-api-access-w22gw\") pod \"1193a35b-978e-4d47-9922-c5ad68bcc819\" (UID: \"1193a35b-978e-4d47-9922-c5ad68bcc819\") " Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.321180 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1193a35b-978e-4d47-9922-c5ad68bcc819-utilities" (OuterVolumeSpecName: "utilities") pod "1193a35b-978e-4d47-9922-c5ad68bcc819" (UID: "1193a35b-978e-4d47-9922-c5ad68bcc819"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.327068 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1193a35b-978e-4d47-9922-c5ad68bcc819-kube-api-access-w22gw" (OuterVolumeSpecName: "kube-api-access-w22gw") pod "1193a35b-978e-4d47-9922-c5ad68bcc819" (UID: "1193a35b-978e-4d47-9922-c5ad68bcc819"). InnerVolumeSpecName "kube-api-access-w22gw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.338988 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1193a35b-978e-4d47-9922-c5ad68bcc819-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1193a35b-978e-4d47-9922-c5ad68bcc819" (UID: "1193a35b-978e-4d47-9922-c5ad68bcc819"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.423085 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1193a35b-978e-4d47-9922-c5ad68bcc819-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.423120 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1193a35b-978e-4d47-9922-c5ad68bcc819-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.423130 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w22gw\" (UniqueName: \"kubernetes.io/projected/1193a35b-978e-4d47-9922-c5ad68bcc819-kube-api-access-w22gw\") on node \"crc\" DevicePath \"\"" Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.840648 4851 generic.go:334] "Generic (PLEG): container finished" podID="1193a35b-978e-4d47-9922-c5ad68bcc819" containerID="9d91e2decd3260ae9987d27dc8c03c275213e8e575a7657dad6d182726cda63d" exitCode=0 Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.840693 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqcxc" event={"ID":"1193a35b-978e-4d47-9922-c5ad68bcc819","Type":"ContainerDied","Data":"9d91e2decd3260ae9987d27dc8c03c275213e8e575a7657dad6d182726cda63d"} Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.840723 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqcxc" event={"ID":"1193a35b-978e-4d47-9922-c5ad68bcc819","Type":"ContainerDied","Data":"718b3c379bdf69877a6d760d261fded9b4e4b598b0aa534c2ba3e482f5c1aedd"} Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.840745 4851 scope.go:117] "RemoveContainer" containerID="9d91e2decd3260ae9987d27dc8c03c275213e8e575a7657dad6d182726cda63d" Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.841621 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tqcxc" Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.870407 4851 scope.go:117] "RemoveContainer" containerID="672a6df3b4c0780c3567ee01ad0bccfac2bb9c56af5b4ef0bb7655b43aa26b70" Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.892996 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tqcxc"] Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.901378 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tqcxc"] Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.911712 4851 scope.go:117] "RemoveContainer" containerID="2c24992a411b54f1780f7e8c97b682d8bba089d5789ff56a6b058ad9f6ce7513" Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.938877 4851 scope.go:117] "RemoveContainer" containerID="9d91e2decd3260ae9987d27dc8c03c275213e8e575a7657dad6d182726cda63d" Oct 01 14:30:45 crc kubenswrapper[4851]: E1001 14:30:45.941618 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d91e2decd3260ae9987d27dc8c03c275213e8e575a7657dad6d182726cda63d\": container with ID starting with 9d91e2decd3260ae9987d27dc8c03c275213e8e575a7657dad6d182726cda63d not found: ID does not exist" containerID="9d91e2decd3260ae9987d27dc8c03c275213e8e575a7657dad6d182726cda63d" Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.941783 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d91e2decd3260ae9987d27dc8c03c275213e8e575a7657dad6d182726cda63d"} err="failed to get container status \"9d91e2decd3260ae9987d27dc8c03c275213e8e575a7657dad6d182726cda63d\": rpc error: code = NotFound desc = could not find container \"9d91e2decd3260ae9987d27dc8c03c275213e8e575a7657dad6d182726cda63d\": container with ID starting with 9d91e2decd3260ae9987d27dc8c03c275213e8e575a7657dad6d182726cda63d not found: ID does not exist" Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.941813 4851 scope.go:117] "RemoveContainer" containerID="672a6df3b4c0780c3567ee01ad0bccfac2bb9c56af5b4ef0bb7655b43aa26b70" Oct 01 14:30:45 crc kubenswrapper[4851]: E1001 14:30:45.942148 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"672a6df3b4c0780c3567ee01ad0bccfac2bb9c56af5b4ef0bb7655b43aa26b70\": container with ID starting with 672a6df3b4c0780c3567ee01ad0bccfac2bb9c56af5b4ef0bb7655b43aa26b70 not found: ID does not exist" containerID="672a6df3b4c0780c3567ee01ad0bccfac2bb9c56af5b4ef0bb7655b43aa26b70" Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.942171 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"672a6df3b4c0780c3567ee01ad0bccfac2bb9c56af5b4ef0bb7655b43aa26b70"} err="failed to get container status \"672a6df3b4c0780c3567ee01ad0bccfac2bb9c56af5b4ef0bb7655b43aa26b70\": rpc error: code = NotFound desc = could not find container \"672a6df3b4c0780c3567ee01ad0bccfac2bb9c56af5b4ef0bb7655b43aa26b70\": container with ID starting with 672a6df3b4c0780c3567ee01ad0bccfac2bb9c56af5b4ef0bb7655b43aa26b70 not found: ID does not exist" Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.942200 4851 scope.go:117] "RemoveContainer" containerID="2c24992a411b54f1780f7e8c97b682d8bba089d5789ff56a6b058ad9f6ce7513" Oct 01 14:30:45 crc kubenswrapper[4851]: E1001 14:30:45.942416 4851 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2c24992a411b54f1780f7e8c97b682d8bba089d5789ff56a6b058ad9f6ce7513\": container with ID starting with 2c24992a411b54f1780f7e8c97b682d8bba089d5789ff56a6b058ad9f6ce7513 not found: ID does not exist" containerID="2c24992a411b54f1780f7e8c97b682d8bba089d5789ff56a6b058ad9f6ce7513" Oct 01 14:30:45 crc kubenswrapper[4851]: I1001 14:30:45.942454 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c24992a411b54f1780f7e8c97b682d8bba089d5789ff56a6b058ad9f6ce7513"} err="failed to get container status \"2c24992a411b54f1780f7e8c97b682d8bba089d5789ff56a6b058ad9f6ce7513\": rpc error: code = NotFound desc = could not find container \"2c24992a411b54f1780f7e8c97b682d8bba089d5789ff56a6b058ad9f6ce7513\": container with ID starting with 2c24992a411b54f1780f7e8c97b682d8bba089d5789ff56a6b058ad9f6ce7513 not found: ID does not exist" Oct 01 14:30:46 crc kubenswrapper[4851]: I1001 14:30:46.341082 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1193a35b-978e-4d47-9922-c5ad68bcc819" path="/var/lib/kubelet/pods/1193a35b-978e-4d47-9922-c5ad68bcc819/volumes" Oct 01 14:30:47 crc kubenswrapper[4851]: I1001 14:30:47.328158 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:30:47 crc kubenswrapper[4851]: E1001 14:30:47.328578 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:30:58 crc kubenswrapper[4851]: I1001 14:30:58.328645 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:30:58 crc kubenswrapper[4851]: E1001 14:30:58.329361 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:31:13 crc kubenswrapper[4851]: I1001 14:31:13.328973 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:31:13 crc kubenswrapper[4851]: E1001 14:31:13.330088 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:31:18 crc kubenswrapper[4851]: I1001 14:31:18.221310 4851 generic.go:334] "Generic (PLEG): container finished" podID="996ff379-292e-4a71-a09b-164fc21abe76" containerID="5b4251985f79c2018bf83d1cebb7382dd71c6ec474bbce4b04d12541a2e3d24b" exitCode=0 Oct 01 14:31:18 crc kubenswrapper[4851]: I1001 14:31:18.221460 4851 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"996ff379-292e-4a71-a09b-164fc21abe76","Type":"ContainerDied","Data":"5b4251985f79c2018bf83d1cebb7382dd71c6ec474bbce4b04d12541a2e3d24b"} Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.631832 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.758983 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/996ff379-292e-4a71-a09b-164fc21abe76-test-operator-ephemeral-workdir\") pod \"996ff379-292e-4a71-a09b-164fc21abe76\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.759113 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/996ff379-292e-4a71-a09b-164fc21abe76-config-data\") pod \"996ff379-292e-4a71-a09b-164fc21abe76\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.759213 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/996ff379-292e-4a71-a09b-164fc21abe76-test-operator-ephemeral-temporary\") pod \"996ff379-292e-4a71-a09b-164fc21abe76\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.759237 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/996ff379-292e-4a71-a09b-164fc21abe76-ca-certs\") pod \"996ff379-292e-4a71-a09b-164fc21abe76\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.759324 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/996ff379-292e-4a71-a09b-164fc21abe76-ssh-key\") pod \"996ff379-292e-4a71-a09b-164fc21abe76\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.759389 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"996ff379-292e-4a71-a09b-164fc21abe76\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.759435 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/996ff379-292e-4a71-a09b-164fc21abe76-openstack-config\") pod \"996ff379-292e-4a71-a09b-164fc21abe76\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.759565 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7x6r\" (UniqueName: \"kubernetes.io/projected/996ff379-292e-4a71-a09b-164fc21abe76-kube-api-access-f7x6r\") pod \"996ff379-292e-4a71-a09b-164fc21abe76\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.759598 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/996ff379-292e-4a71-a09b-164fc21abe76-openstack-config-secret\") pod 
\"996ff379-292e-4a71-a09b-164fc21abe76\" (UID: \"996ff379-292e-4a71-a09b-164fc21abe76\") " Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.759960 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/996ff379-292e-4a71-a09b-164fc21abe76-config-data" (OuterVolumeSpecName: "config-data") pod "996ff379-292e-4a71-a09b-164fc21abe76" (UID: "996ff379-292e-4a71-a09b-164fc21abe76"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.760492 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/996ff379-292e-4a71-a09b-164fc21abe76-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "996ff379-292e-4a71-a09b-164fc21abe76" (UID: "996ff379-292e-4a71-a09b-164fc21abe76"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.763686 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/996ff379-292e-4a71-a09b-164fc21abe76-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "996ff379-292e-4a71-a09b-164fc21abe76" (UID: "996ff379-292e-4a71-a09b-164fc21abe76"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.764624 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "996ff379-292e-4a71-a09b-164fc21abe76" (UID: "996ff379-292e-4a71-a09b-164fc21abe76"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.766146 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/996ff379-292e-4a71-a09b-164fc21abe76-kube-api-access-f7x6r" (OuterVolumeSpecName: "kube-api-access-f7x6r") pod "996ff379-292e-4a71-a09b-164fc21abe76" (UID: "996ff379-292e-4a71-a09b-164fc21abe76"). InnerVolumeSpecName "kube-api-access-f7x6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.792312 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/996ff379-292e-4a71-a09b-164fc21abe76-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "996ff379-292e-4a71-a09b-164fc21abe76" (UID: "996ff379-292e-4a71-a09b-164fc21abe76"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.792612 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/996ff379-292e-4a71-a09b-164fc21abe76-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "996ff379-292e-4a71-a09b-164fc21abe76" (UID: "996ff379-292e-4a71-a09b-164fc21abe76"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.792727 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/996ff379-292e-4a71-a09b-164fc21abe76-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "996ff379-292e-4a71-a09b-164fc21abe76" (UID: "996ff379-292e-4a71-a09b-164fc21abe76"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.837867 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/996ff379-292e-4a71-a09b-164fc21abe76-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "996ff379-292e-4a71-a09b-164fc21abe76" (UID: "996ff379-292e-4a71-a09b-164fc21abe76"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.861591 4851 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/996ff379-292e-4a71-a09b-164fc21abe76-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.861627 4851 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/996ff379-292e-4a71-a09b-164fc21abe76-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.861636 4851 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/996ff379-292e-4a71-a09b-164fc21abe76-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.861668 4851 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.861679 4851 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/996ff379-292e-4a71-a09b-164fc21abe76-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.861690 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7x6r\" (UniqueName: \"kubernetes.io/projected/996ff379-292e-4a71-a09b-164fc21abe76-kube-api-access-f7x6r\") on node \"crc\" DevicePath \"\"" Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.861701 4851 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/996ff379-292e-4a71-a09b-164fc21abe76-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.861709 4851 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/996ff379-292e-4a71-a09b-164fc21abe76-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.861717 4851 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/996ff379-292e-4a71-a09b-164fc21abe76-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.882113 4851 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 01 14:31:19 crc kubenswrapper[4851]: I1001 14:31:19.963349 4851 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 01 14:31:20 crc kubenswrapper[4851]: I1001 14:31:20.245864 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"996ff379-292e-4a71-a09b-164fc21abe76","Type":"ContainerDied","Data":"f2a17b43e5559a79bbc43654e356c244a74fa6c4c363203ec6011af30af38f5f"} Oct 01 14:31:20 crc kubenswrapper[4851]: I1001 14:31:20.245911 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2a17b43e5559a79bbc43654e356c244a74fa6c4c363203ec6011af30af38f5f" Oct 01 14:31:20 crc kubenswrapper[4851]: I1001 14:31:20.245943 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 01 14:31:24 crc kubenswrapper[4851]: I1001 14:31:24.434976 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 01 14:31:24 crc kubenswrapper[4851]: E1001 14:31:24.436007 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1193a35b-978e-4d47-9922-c5ad68bcc819" containerName="extract-utilities" Oct 01 14:31:24 crc kubenswrapper[4851]: I1001 14:31:24.436035 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="1193a35b-978e-4d47-9922-c5ad68bcc819" containerName="extract-utilities" Oct 01 14:31:24 crc kubenswrapper[4851]: E1001 14:31:24.436079 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1193a35b-978e-4d47-9922-c5ad68bcc819" containerName="registry-server" Oct 01 14:31:24 crc kubenswrapper[4851]: I1001 14:31:24.436089 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="1193a35b-978e-4d47-9922-c5ad68bcc819" containerName="registry-server" Oct 01 14:31:24 crc kubenswrapper[4851]: E1001 14:31:24.436134 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1193a35b-978e-4d47-9922-c5ad68bcc819" containerName="extract-content" Oct 01 14:31:24 crc kubenswrapper[4851]: I1001 14:31:24.436146 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="1193a35b-978e-4d47-9922-c5ad68bcc819" containerName="extract-content" Oct 01 14:31:24 crc kubenswrapper[4851]: E1001 14:31:24.436174 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="996ff379-292e-4a71-a09b-164fc21abe76" containerName="tempest-tests-tempest-tests-runner" Oct 01 14:31:24 crc kubenswrapper[4851]: I1001 14:31:24.436183 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="996ff379-292e-4a71-a09b-164fc21abe76" containerName="tempest-tests-tempest-tests-runner" Oct 01 14:31:24 crc kubenswrapper[4851]: I1001 14:31:24.436459 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="996ff379-292e-4a71-a09b-164fc21abe76" containerName="tempest-tests-tempest-tests-runner" Oct 01 14:31:24 crc kubenswrapper[4851]: I1001 14:31:24.436484 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="1193a35b-978e-4d47-9922-c5ad68bcc819" containerName="registry-server" Oct 01 14:31:24 crc kubenswrapper[4851]: I1001 14:31:24.437437 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:31:24 crc kubenswrapper[4851]: I1001 14:31:24.440004 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-wqpgn" Oct 01 14:31:24 crc kubenswrapper[4851]: I1001 14:31:24.447878 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 01 14:31:24 crc kubenswrapper[4851]: I1001 14:31:24.569385 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shqjf\" (UniqueName: \"kubernetes.io/projected/4b08465d-adad-4a72-b5bb-a50af717f8f6-kube-api-access-shqjf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4b08465d-adad-4a72-b5bb-a50af717f8f6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:31:24 crc kubenswrapper[4851]: I1001 14:31:24.569460 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4b08465d-adad-4a72-b5bb-a50af717f8f6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:31:24 crc kubenswrapper[4851]: I1001 14:31:24.670614 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shqjf\" (UniqueName: \"kubernetes.io/projected/4b08465d-adad-4a72-b5bb-a50af717f8f6-kube-api-access-shqjf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4b08465d-adad-4a72-b5bb-a50af717f8f6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:31:24 crc kubenswrapper[4851]: I1001 14:31:24.670689 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4b08465d-adad-4a72-b5bb-a50af717f8f6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:31:24 crc kubenswrapper[4851]: I1001 14:31:24.671333 4851 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4b08465d-adad-4a72-b5bb-a50af717f8f6\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:31:24 crc kubenswrapper[4851]: I1001 14:31:24.707648 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shqjf\" (UniqueName: \"kubernetes.io/projected/4b08465d-adad-4a72-b5bb-a50af717f8f6-kube-api-access-shqjf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4b08465d-adad-4a72-b5bb-a50af717f8f6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:31:24 crc kubenswrapper[4851]: I1001 14:31:24.718438 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4b08465d-adad-4a72-b5bb-a50af717f8f6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:31:24 crc 
kubenswrapper[4851]: I1001 14:31:24.766985 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:31:25 crc kubenswrapper[4851]: I1001 14:31:25.264253 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 01 14:31:25 crc kubenswrapper[4851]: W1001 14:31:25.267303 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b08465d_adad_4a72_b5bb_a50af717f8f6.slice/crio-9a11b1d66c98671f90a0ecb0ca6a980146951455e0ccde2d8dbbe07380997f0c WatchSource:0}: Error finding container 9a11b1d66c98671f90a0ecb0ca6a980146951455e0ccde2d8dbbe07380997f0c: Status 404 returned error can't find the container with id 9a11b1d66c98671f90a0ecb0ca6a980146951455e0ccde2d8dbbe07380997f0c Oct 01 14:31:25 crc kubenswrapper[4851]: I1001 14:31:25.300071 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"4b08465d-adad-4a72-b5bb-a50af717f8f6","Type":"ContainerStarted","Data":"9a11b1d66c98671f90a0ecb0ca6a980146951455e0ccde2d8dbbe07380997f0c"} Oct 01 14:31:26 crc kubenswrapper[4851]: I1001 14:31:26.329504 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:31:26 crc kubenswrapper[4851]: E1001 14:31:26.330207 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:31:28 crc kubenswrapper[4851]: I1001 14:31:28.342668 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"4b08465d-adad-4a72-b5bb-a50af717f8f6","Type":"ContainerStarted","Data":"df315574a1bd02330f854d911850d9e2fe291435cbf395aeb7efebc572d7ed38"} Oct 01 14:31:28 crc kubenswrapper[4851]: I1001 14:31:28.364229 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.410346947 podStartE2EDuration="4.364181083s" podCreationTimestamp="2025-10-01 14:31:24 +0000 UTC" firstStartedPulling="2025-10-01 14:31:25.270278796 +0000 UTC m=+5893.615396322" lastFinishedPulling="2025-10-01 14:31:27.224112932 +0000 UTC m=+5895.569230458" observedRunningTime="2025-10-01 14:31:28.357210524 +0000 UTC m=+5896.702328050" watchObservedRunningTime="2025-10-01 14:31:28.364181083 +0000 UTC m=+5896.709298579" Oct 01 14:31:39 crc kubenswrapper[4851]: I1001 14:31:39.329328 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:31:39 crc kubenswrapper[4851]: E1001 14:31:39.331068 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" 
podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:31:44 crc kubenswrapper[4851]: I1001 14:31:44.991294 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t6dhv/must-gather-t6htd"] Oct 01 14:31:44 crc kubenswrapper[4851]: I1001 14:31:44.993850 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6dhv/must-gather-t6htd" Oct 01 14:31:45 crc kubenswrapper[4851]: I1001 14:31:45.002839 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t6dhv/must-gather-t6htd"] Oct 01 14:31:45 crc kubenswrapper[4851]: I1001 14:31:45.003228 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-t6dhv"/"openshift-service-ca.crt" Oct 01 14:31:45 crc kubenswrapper[4851]: I1001 14:31:45.003443 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-t6dhv"/"default-dockercfg-wfpzm" Oct 01 14:31:45 crc kubenswrapper[4851]: I1001 14:31:45.003627 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-t6dhv"/"kube-root-ca.crt" Oct 01 14:31:45 crc kubenswrapper[4851]: I1001 14:31:45.165264 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfe4a600-52da-4824-8cb1-416ea1ac94dd-must-gather-output\") pod \"must-gather-t6htd\" (UID: \"dfe4a600-52da-4824-8cb1-416ea1ac94dd\") " pod="openshift-must-gather-t6dhv/must-gather-t6htd" Oct 01 14:31:45 crc kubenswrapper[4851]: I1001 14:31:45.165368 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6tdc\" (UniqueName: \"kubernetes.io/projected/dfe4a600-52da-4824-8cb1-416ea1ac94dd-kube-api-access-l6tdc\") pod \"must-gather-t6htd\" (UID: \"dfe4a600-52da-4824-8cb1-416ea1ac94dd\") " pod="openshift-must-gather-t6dhv/must-gather-t6htd" Oct 01 14:31:45 crc kubenswrapper[4851]: I1001 14:31:45.266396 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6tdc\" (UniqueName: \"kubernetes.io/projected/dfe4a600-52da-4824-8cb1-416ea1ac94dd-kube-api-access-l6tdc\") pod \"must-gather-t6htd\" (UID: \"dfe4a600-52da-4824-8cb1-416ea1ac94dd\") " pod="openshift-must-gather-t6dhv/must-gather-t6htd" Oct 01 14:31:45 crc kubenswrapper[4851]: I1001 14:31:45.266524 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfe4a600-52da-4824-8cb1-416ea1ac94dd-must-gather-output\") pod \"must-gather-t6htd\" (UID: \"dfe4a600-52da-4824-8cb1-416ea1ac94dd\") " pod="openshift-must-gather-t6dhv/must-gather-t6htd" Oct 01 14:31:45 crc kubenswrapper[4851]: I1001 14:31:45.266993 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfe4a600-52da-4824-8cb1-416ea1ac94dd-must-gather-output\") pod \"must-gather-t6htd\" (UID: \"dfe4a600-52da-4824-8cb1-416ea1ac94dd\") " pod="openshift-must-gather-t6dhv/must-gather-t6htd" Oct 01 14:31:45 crc kubenswrapper[4851]: I1001 14:31:45.288988 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6tdc\" (UniqueName: \"kubernetes.io/projected/dfe4a600-52da-4824-8cb1-416ea1ac94dd-kube-api-access-l6tdc\") pod \"must-gather-t6htd\" (UID: \"dfe4a600-52da-4824-8cb1-416ea1ac94dd\") " pod="openshift-must-gather-t6dhv/must-gather-t6htd" Oct 01 14:31:45 crc 
kubenswrapper[4851]: I1001 14:31:45.317423 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6dhv/must-gather-t6htd" Oct 01 14:31:45 crc kubenswrapper[4851]: I1001 14:31:45.770267 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t6dhv/must-gather-t6htd"] Oct 01 14:31:46 crc kubenswrapper[4851]: I1001 14:31:46.552397 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6dhv/must-gather-t6htd" event={"ID":"dfe4a600-52da-4824-8cb1-416ea1ac94dd","Type":"ContainerStarted","Data":"be0719f80b9f9db5153d84e207e6e44ba9a55e26cc9c2023e9624d828fe7e8da"} Oct 01 14:31:52 crc kubenswrapper[4851]: I1001 14:31:52.627768 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6dhv/must-gather-t6htd" event={"ID":"dfe4a600-52da-4824-8cb1-416ea1ac94dd","Type":"ContainerStarted","Data":"bd04686090950d230ccec218de316aaa5002ccfc6e816b6fa8e2fb9f2681ca52"} Oct 01 14:31:52 crc kubenswrapper[4851]: I1001 14:31:52.628303 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6dhv/must-gather-t6htd" event={"ID":"dfe4a600-52da-4824-8cb1-416ea1ac94dd","Type":"ContainerStarted","Data":"9c569db2c115195f45e439d574aa36a7fabec9d8418ab7a9b880d942bb705fd2"} Oct 01 14:31:52 crc kubenswrapper[4851]: I1001 14:31:52.659698 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t6dhv/must-gather-t6htd" podStartSLOduration=2.7211181509999998 podStartE2EDuration="8.65967626s" podCreationTimestamp="2025-10-01 14:31:44 +0000 UTC" firstStartedPulling="2025-10-01 14:31:45.775420988 +0000 UTC m=+5914.120538474" lastFinishedPulling="2025-10-01 14:31:51.713979097 +0000 UTC m=+5920.059096583" observedRunningTime="2025-10-01 14:31:52.649328275 +0000 UTC m=+5920.994445761" watchObservedRunningTime="2025-10-01 14:31:52.65967626 +0000 UTC m=+5921.004793756" Oct 01 14:31:53 crc kubenswrapper[4851]: I1001 14:31:53.329440 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:31:53 crc kubenswrapper[4851]: E1001 14:31:53.330182 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:31:56 crc kubenswrapper[4851]: I1001 14:31:56.187548 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t6dhv/crc-debug-fvwgv"] Oct 01 14:31:56 crc kubenswrapper[4851]: I1001 14:31:56.189778 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t6dhv/crc-debug-fvwgv" Oct 01 14:31:56 crc kubenswrapper[4851]: I1001 14:31:56.302406 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58db37d1-3709-42bc-af9c-dc416e1c4654-host\") pod \"crc-debug-fvwgv\" (UID: \"58db37d1-3709-42bc-af9c-dc416e1c4654\") " pod="openshift-must-gather-t6dhv/crc-debug-fvwgv" Oct 01 14:31:56 crc kubenswrapper[4851]: I1001 14:31:56.302563 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6g9v\" (UniqueName: \"kubernetes.io/projected/58db37d1-3709-42bc-af9c-dc416e1c4654-kube-api-access-v6g9v\") pod \"crc-debug-fvwgv\" (UID: \"58db37d1-3709-42bc-af9c-dc416e1c4654\") " pod="openshift-must-gather-t6dhv/crc-debug-fvwgv" Oct 01 14:31:56 crc kubenswrapper[4851]: I1001 14:31:56.404634 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58db37d1-3709-42bc-af9c-dc416e1c4654-host\") pod \"crc-debug-fvwgv\" (UID: \"58db37d1-3709-42bc-af9c-dc416e1c4654\") " pod="openshift-must-gather-t6dhv/crc-debug-fvwgv" Oct 01 14:31:56 crc kubenswrapper[4851]: I1001 14:31:56.404721 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6g9v\" (UniqueName: \"kubernetes.io/projected/58db37d1-3709-42bc-af9c-dc416e1c4654-kube-api-access-v6g9v\") pod \"crc-debug-fvwgv\" (UID: \"58db37d1-3709-42bc-af9c-dc416e1c4654\") " pod="openshift-must-gather-t6dhv/crc-debug-fvwgv" Oct 01 14:31:56 crc kubenswrapper[4851]: I1001 14:31:56.404801 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58db37d1-3709-42bc-af9c-dc416e1c4654-host\") pod \"crc-debug-fvwgv\" (UID: \"58db37d1-3709-42bc-af9c-dc416e1c4654\") " pod="openshift-must-gather-t6dhv/crc-debug-fvwgv" Oct 01 14:31:56 crc kubenswrapper[4851]: I1001 14:31:56.425862 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6g9v\" (UniqueName: \"kubernetes.io/projected/58db37d1-3709-42bc-af9c-dc416e1c4654-kube-api-access-v6g9v\") pod \"crc-debug-fvwgv\" (UID: \"58db37d1-3709-42bc-af9c-dc416e1c4654\") " pod="openshift-must-gather-t6dhv/crc-debug-fvwgv" Oct 01 14:31:56 crc kubenswrapper[4851]: I1001 14:31:56.509487 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t6dhv/crc-debug-fvwgv" Oct 01 14:31:56 crc kubenswrapper[4851]: W1001 14:31:56.551238 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58db37d1_3709_42bc_af9c_dc416e1c4654.slice/crio-aa3e361229bc76432d915f626fee3e8690f70a51c908352ca304875f48084c8a WatchSource:0}: Error finding container aa3e361229bc76432d915f626fee3e8690f70a51c908352ca304875f48084c8a: Status 404 returned error can't find the container with id aa3e361229bc76432d915f626fee3e8690f70a51c908352ca304875f48084c8a Oct 01 14:31:56 crc kubenswrapper[4851]: I1001 14:31:56.699575 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6dhv/crc-debug-fvwgv" event={"ID":"58db37d1-3709-42bc-af9c-dc416e1c4654","Type":"ContainerStarted","Data":"aa3e361229bc76432d915f626fee3e8690f70a51c908352ca304875f48084c8a"} Oct 01 14:32:08 crc kubenswrapper[4851]: I1001 14:32:08.328269 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:32:08 crc kubenswrapper[4851]: E1001 14:32:08.329150 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:32:09 crc kubenswrapper[4851]: I1001 14:32:09.830158 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6dhv/crc-debug-fvwgv" event={"ID":"58db37d1-3709-42bc-af9c-dc416e1c4654","Type":"ContainerStarted","Data":"24cb527df17535b6fa3e2f033621745f2eed1ce472df593a611a03ea04175e60"} Oct 01 14:32:21 crc kubenswrapper[4851]: I1001 14:32:21.329484 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:32:21 crc kubenswrapper[4851]: E1001 14:32:21.330258 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:32:34 crc kubenswrapper[4851]: I1001 14:32:34.328031 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:32:34 crc kubenswrapper[4851]: E1001 14:32:34.329829 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:32:45 crc kubenswrapper[4851]: I1001 14:32:45.329052 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:32:45 crc kubenswrapper[4851]: E1001 14:32:45.330401 4851 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:32:57 crc kubenswrapper[4851]: I1001 14:32:57.329166 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:32:57 crc kubenswrapper[4851]: E1001 14:32:57.329948 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:33:12 crc kubenswrapper[4851]: I1001 14:33:12.356566 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:33:12 crc kubenswrapper[4851]: E1001 14:33:12.360249 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:33:20 crc kubenswrapper[4851]: I1001 14:33:20.515044 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t6dhv/crc-debug-fvwgv" podStartSLOduration=71.853844722 podStartE2EDuration="1m24.515026039s" podCreationTimestamp="2025-10-01 14:31:56 +0000 UTC" firstStartedPulling="2025-10-01 14:31:56.553757446 +0000 UTC m=+5924.898874932" lastFinishedPulling="2025-10-01 14:32:09.214938773 +0000 UTC m=+5937.560056249" observedRunningTime="2025-10-01 14:32:09.852683316 +0000 UTC m=+5938.197800802" watchObservedRunningTime="2025-10-01 14:33:20.515026039 +0000 UTC m=+6008.860143525" Oct 01 14:33:20 crc kubenswrapper[4851]: I1001 14:33:20.515615 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cvgl7"] Oct 01 14:33:20 crc kubenswrapper[4851]: I1001 14:33:20.526344 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cvgl7" Oct 01 14:33:20 crc kubenswrapper[4851]: I1001 14:33:20.531941 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cvgl7"] Oct 01 14:33:20 crc kubenswrapper[4851]: I1001 14:33:20.666416 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e4421cd-0461-44ee-b937-b12e4fb50766-catalog-content\") pod \"redhat-operators-cvgl7\" (UID: \"2e4421cd-0461-44ee-b937-b12e4fb50766\") " pod="openshift-marketplace/redhat-operators-cvgl7" Oct 01 14:33:20 crc kubenswrapper[4851]: I1001 14:33:20.666484 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57pd6\" (UniqueName: \"kubernetes.io/projected/2e4421cd-0461-44ee-b937-b12e4fb50766-kube-api-access-57pd6\") pod \"redhat-operators-cvgl7\" (UID: \"2e4421cd-0461-44ee-b937-b12e4fb50766\") " pod="openshift-marketplace/redhat-operators-cvgl7" Oct 01 14:33:20 crc kubenswrapper[4851]: I1001 14:33:20.666652 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e4421cd-0461-44ee-b937-b12e4fb50766-utilities\") pod \"redhat-operators-cvgl7\" (UID: \"2e4421cd-0461-44ee-b937-b12e4fb50766\") " pod="openshift-marketplace/redhat-operators-cvgl7" Oct 01 14:33:20 crc kubenswrapper[4851]: I1001 14:33:20.769161 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e4421cd-0461-44ee-b937-b12e4fb50766-catalog-content\") pod \"redhat-operators-cvgl7\" (UID: \"2e4421cd-0461-44ee-b937-b12e4fb50766\") " pod="openshift-marketplace/redhat-operators-cvgl7" Oct 01 14:33:20 crc kubenswrapper[4851]: I1001 14:33:20.769261 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57pd6\" (UniqueName: \"kubernetes.io/projected/2e4421cd-0461-44ee-b937-b12e4fb50766-kube-api-access-57pd6\") pod \"redhat-operators-cvgl7\" (UID: \"2e4421cd-0461-44ee-b937-b12e4fb50766\") " pod="openshift-marketplace/redhat-operators-cvgl7" Oct 01 14:33:20 crc kubenswrapper[4851]: I1001 14:33:20.769437 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e4421cd-0461-44ee-b937-b12e4fb50766-utilities\") pod \"redhat-operators-cvgl7\" (UID: \"2e4421cd-0461-44ee-b937-b12e4fb50766\") " pod="openshift-marketplace/redhat-operators-cvgl7" Oct 01 14:33:20 crc kubenswrapper[4851]: I1001 14:33:20.769877 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e4421cd-0461-44ee-b937-b12e4fb50766-utilities\") pod \"redhat-operators-cvgl7\" (UID: \"2e4421cd-0461-44ee-b937-b12e4fb50766\") " pod="openshift-marketplace/redhat-operators-cvgl7" Oct 01 14:33:20 crc kubenswrapper[4851]: I1001 14:33:20.769892 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e4421cd-0461-44ee-b937-b12e4fb50766-catalog-content\") pod \"redhat-operators-cvgl7\" (UID: \"2e4421cd-0461-44ee-b937-b12e4fb50766\") " pod="openshift-marketplace/redhat-operators-cvgl7" Oct 01 14:33:20 crc kubenswrapper[4851]: I1001 14:33:20.793200 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-57pd6\" (UniqueName: \"kubernetes.io/projected/2e4421cd-0461-44ee-b937-b12e4fb50766-kube-api-access-57pd6\") pod \"redhat-operators-cvgl7\" (UID: \"2e4421cd-0461-44ee-b937-b12e4fb50766\") " pod="openshift-marketplace/redhat-operators-cvgl7" Oct 01 14:33:20 crc kubenswrapper[4851]: I1001 14:33:20.851048 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cvgl7" Oct 01 14:33:21 crc kubenswrapper[4851]: I1001 14:33:21.400355 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cvgl7"] Oct 01 14:33:21 crc kubenswrapper[4851]: I1001 14:33:21.495005 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvgl7" event={"ID":"2e4421cd-0461-44ee-b937-b12e4fb50766","Type":"ContainerStarted","Data":"8c862bfe47bcdafd64beaa4c3e59b695abcbb5a13ce1036d09a0272b9f0bda54"} Oct 01 14:33:21 crc kubenswrapper[4851]: I1001 14:33:21.755581 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-79bfcf4f68-4nz4v_62a44312-8062-4c82-ab00-f87600fa8f93/barbican-api/0.log" Oct 01 14:33:21 crc kubenswrapper[4851]: I1001 14:33:21.924706 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-79bfcf4f68-4nz4v_62a44312-8062-4c82-ab00-f87600fa8f93/barbican-api-log/0.log" Oct 01 14:33:22 crc kubenswrapper[4851]: I1001 14:33:22.145811 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f7d86b4d6-9685z_83d01de2-5036-447d-9bcc-adae38fc5202/barbican-keystone-listener/0.log" Oct 01 14:33:22 crc kubenswrapper[4851]: I1001 14:33:22.489734 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6c49b89697-kzlf9_4fb21634-7084-49a9-88d7-8759e5c794cb/barbican-worker/0.log" Oct 01 14:33:22 crc kubenswrapper[4851]: I1001 14:33:22.586325 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f7d86b4d6-9685z_83d01de2-5036-447d-9bcc-adae38fc5202/barbican-keystone-listener-log/0.log" Oct 01 14:33:22 crc kubenswrapper[4851]: I1001 14:33:22.884141 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6c49b89697-kzlf9_4fb21634-7084-49a9-88d7-8759e5c794cb/barbican-worker-log/0.log" Oct 01 14:33:22 crc kubenswrapper[4851]: I1001 14:33:22.921615 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf_db4056ce-42b4-4853-9e9f-69320e29e5cc/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:33:23 crc kubenswrapper[4851]: I1001 14:33:23.267824 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5b04518a-1699-4ef1-8b54-57c7343e081c/proxy-httpd/0.log" Oct 01 14:33:23 crc kubenswrapper[4851]: I1001 14:33:23.329859 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5b04518a-1699-4ef1-8b54-57c7343e081c/ceilometer-notification-agent/0.log" Oct 01 14:33:23 crc kubenswrapper[4851]: I1001 14:33:23.369925 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5b04518a-1699-4ef1-8b54-57c7343e081c/ceilometer-central-agent/0.log" Oct 01 14:33:23 crc kubenswrapper[4851]: I1001 14:33:23.516471 4851 generic.go:334] "Generic (PLEG): container finished" podID="2e4421cd-0461-44ee-b937-b12e4fb50766" containerID="bc81e4439a61df1a94dab2657b71ec3e15003a0ac0db35947512432a985dd74d" exitCode=0 Oct 01 
14:33:23 crc kubenswrapper[4851]: I1001 14:33:23.516521 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvgl7" event={"ID":"2e4421cd-0461-44ee-b937-b12e4fb50766","Type":"ContainerDied","Data":"bc81e4439a61df1a94dab2657b71ec3e15003a0ac0db35947512432a985dd74d"} Oct 01 14:33:23 crc kubenswrapper[4851]: I1001 14:33:23.546147 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5b04518a-1699-4ef1-8b54-57c7343e081c/sg-core/0.log" Oct 01 14:33:23 crc kubenswrapper[4851]: I1001 14:33:23.828204 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_698a82d3-c7a7-4b4b-8cf9-46f6589744d9/cinder-api-log/0.log" Oct 01 14:33:23 crc kubenswrapper[4851]: I1001 14:33:23.982458 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_698a82d3-c7a7-4b4b-8cf9-46f6589744d9/cinder-api/0.log" Oct 01 14:33:24 crc kubenswrapper[4851]: I1001 14:33:24.100075 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9/probe/0.log" Oct 01 14:33:24 crc kubenswrapper[4851]: I1001 14:33:24.122748 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9/cinder-scheduler/0.log" Oct 01 14:33:24 crc kubenswrapper[4851]: I1001 14:33:24.588271 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw_4e073c07-f76e-424b-a1d1-68fcabf7f063/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:33:24 crc kubenswrapper[4851]: I1001 14:33:24.632143 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-khbc7_7966e9d5-430c-417e-9ba2-b53c598831e7/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:33:25 crc kubenswrapper[4851]: I1001 14:33:25.218197 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-659b5fb8c5-5wqbr_73149e01-4273-4703-a6bd-0b44c3ce5aad/init/0.log" Oct 01 14:33:25 crc kubenswrapper[4851]: I1001 14:33:25.218528 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd_0168fd9f-0f7b-432d-a09d-927ac34e34b3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:33:25 crc kubenswrapper[4851]: I1001 14:33:25.329929 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:33:25 crc kubenswrapper[4851]: E1001 14:33:25.330537 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:33:25 crc kubenswrapper[4851]: I1001 14:33:25.690531 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-659b5fb8c5-5wqbr_73149e01-4273-4703-a6bd-0b44c3ce5aad/init/0.log" Oct 01 14:33:25 crc kubenswrapper[4851]: I1001 14:33:25.719147 4851 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-pp46c_6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:33:25 crc kubenswrapper[4851]: I1001 14:33:25.805956 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-659b5fb8c5-5wqbr_73149e01-4273-4703-a6bd-0b44c3ce5aad/dnsmasq-dns/0.log" Oct 01 14:33:26 crc kubenswrapper[4851]: I1001 14:33:26.163652 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c/glance-httpd/0.log" Oct 01 14:33:26 crc kubenswrapper[4851]: I1001 14:33:26.187789 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c/glance-log/0.log" Oct 01 14:33:26 crc kubenswrapper[4851]: I1001 14:33:26.354383 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0/glance-httpd/0.log" Oct 01 14:33:26 crc kubenswrapper[4851]: I1001 14:33:26.366702 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0/glance-log/0.log" Oct 01 14:33:26 crc kubenswrapper[4851]: I1001 14:33:26.652559 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-58c9859d68-bckn5_2fcf93f8-06db-4cab-8699-9051ca2ae50a/horizon/0.log" Oct 01 14:33:26 crc kubenswrapper[4851]: I1001 14:33:26.752706 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn_dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:33:26 crc kubenswrapper[4851]: I1001 14:33:26.925601 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-swrg6_b850e5c8-7c3d-4ae2-ab48-a17e80c41091/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:33:27 crc kubenswrapper[4851]: I1001 14:33:27.288687 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29322121-sgptb_22136cf3-22ce-4ebc-b5b6-0b61f25844e8/keystone-cron/0.log" Oct 01 14:33:27 crc kubenswrapper[4851]: I1001 14:33:27.563664 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_3ce21dde-21c2-49d7-aac0-cba896d9b1de/kube-state-metrics/0.log" Oct 01 14:33:28 crc kubenswrapper[4851]: I1001 14:33:28.036305 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-9s94l_18d67c2a-547a-448f-8492-c4c997cc938e/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:33:28 crc kubenswrapper[4851]: I1001 14:33:28.221350 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-58c9859d68-bckn5_2fcf93f8-06db-4cab-8699-9051ca2ae50a/horizon-log/0.log" Oct 01 14:33:28 crc kubenswrapper[4851]: I1001 14:33:28.356813 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-78d78d4545-tv25n_bdf629fa-5ac6-4985-a03a-c77c19cc9adb/keystone-api/0.log" Oct 01 14:33:28 crc kubenswrapper[4851]: I1001 14:33:28.956292 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-57698c5d89-m6vxz_0f07dc57-fd61-4799-8658-3ed1fcc9f01c/neutron-httpd/0.log" Oct 01 14:33:29 crc kubenswrapper[4851]: I1001 14:33:29.198060 4851 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v_f1395fd2-649e-42f8-b320-5d81ae321978/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:33:29 crc kubenswrapper[4851]: I1001 14:33:29.675916 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-57698c5d89-m6vxz_0f07dc57-fd61-4799-8658-3ed1fcc9f01c/neutron-api/0.log" Oct 01 14:33:30 crc kubenswrapper[4851]: I1001 14:33:30.490388 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_1e7d4479-701c-47c9-92b6-31ea543b479f/nova-cell0-conductor-conductor/0.log" Oct 01 14:33:31 crc kubenswrapper[4851]: I1001 14:33:31.064241 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_9e2cacf3-b4a4-4839-a723-4bd4578935a7/nova-cell1-conductor-conductor/0.log" Oct 01 14:33:31 crc kubenswrapper[4851]: I1001 14:33:31.605190 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_1ab5594b-8bc0-4c89-b570-604b54931bbe/nova-cell1-novncproxy-novncproxy/0.log" Oct 01 14:33:31 crc kubenswrapper[4851]: I1001 14:33:31.805524 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_88f8fcd9-1105-4592-a420-98ea9033c3d9/nova-api-api/0.log" Oct 01 14:33:31 crc kubenswrapper[4851]: I1001 14:33:31.979287 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_88f8fcd9-1105-4592-a420-98ea9033c3d9/nova-api-log/0.log" Oct 01 14:33:32 crc kubenswrapper[4851]: I1001 14:33:32.193691 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-5sjkx_75a874c6-cd94-467a-ab74-bede44646604/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:33:32 crc kubenswrapper[4851]: I1001 14:33:32.286620 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522/nova-metadata-log/0.log" Oct 01 14:33:32 crc kubenswrapper[4851]: I1001 14:33:32.935814 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d829ebf1-e5aa-4c23-9a7e-db128f394557/nova-scheduler-scheduler/0.log" Oct 01 14:33:32 crc kubenswrapper[4851]: I1001 14:33:32.954692 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_311e8f50-9e4a-4a03-bc24-04b76d53a238/mysql-bootstrap/0.log" Oct 01 14:33:33 crc kubenswrapper[4851]: I1001 14:33:33.168943 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_311e8f50-9e4a-4a03-bc24-04b76d53a238/galera/0.log" Oct 01 14:33:33 crc kubenswrapper[4851]: I1001 14:33:33.170570 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_311e8f50-9e4a-4a03-bc24-04b76d53a238/mysql-bootstrap/0.log" Oct 01 14:33:33 crc kubenswrapper[4851]: I1001 14:33:33.402722 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7c2ac1cc-c49c-4966-a357-2d1ba04d5671/mysql-bootstrap/0.log" Oct 01 14:33:33 crc kubenswrapper[4851]: I1001 14:33:33.697893 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvgl7" event={"ID":"2e4421cd-0461-44ee-b937-b12e4fb50766","Type":"ContainerStarted","Data":"75565a2cd1f594bbef331f49ceae9be67237f15992070d67ef336bf64e8b3e94"} Oct 01 14:33:33 crc kubenswrapper[4851]: I1001 14:33:33.942850 4851 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_7c2ac1cc-c49c-4966-a357-2d1ba04d5671/mysql-bootstrap/0.log" Oct 01 14:33:34 crc kubenswrapper[4851]: I1001 14:33:34.007452 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7c2ac1cc-c49c-4966-a357-2d1ba04d5671/galera/0.log" Oct 01 14:33:34 crc kubenswrapper[4851]: I1001 14:33:34.182238 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f/openstackclient/0.log" Oct 01 14:33:34 crc kubenswrapper[4851]: I1001 14:33:34.371152 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-dmq5k_7b15e5b7-162e-46ee-a292-bf763704cda6/ovn-controller/0.log" Oct 01 14:33:34 crc kubenswrapper[4851]: I1001 14:33:34.565717 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8sm5d_236d88b9-092a-448d-a793-b133f7abe5f9/openstack-network-exporter/0.log" Oct 01 14:33:34 crc kubenswrapper[4851]: I1001 14:33:34.706448 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522/nova-metadata-metadata/0.log" Oct 01 14:33:34 crc kubenswrapper[4851]: I1001 14:33:34.799900 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gt7fw_928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22/ovsdb-server-init/0.log" Oct 01 14:33:35 crc kubenswrapper[4851]: I1001 14:33:35.025527 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gt7fw_928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22/ovsdb-server-init/0.log" Oct 01 14:33:35 crc kubenswrapper[4851]: I1001 14:33:35.041254 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gt7fw_928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22/ovsdb-server/0.log" Oct 01 14:33:35 crc kubenswrapper[4851]: I1001 14:33:35.340731 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-bkddc_d579751c-546b-4a48-8ae6-f9753609107a/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:33:35 crc kubenswrapper[4851]: I1001 14:33:35.409311 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gt7fw_928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22/ovs-vswitchd/0.log" Oct 01 14:33:35 crc kubenswrapper[4851]: I1001 14:33:35.572845 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_59b6f03e-998a-46c8-a0d7-ce0cbad79b3b/openstack-network-exporter/0.log" Oct 01 14:33:35 crc kubenswrapper[4851]: I1001 14:33:35.667664 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_59b6f03e-998a-46c8-a0d7-ce0cbad79b3b/ovn-northd/0.log" Oct 01 14:33:35 crc kubenswrapper[4851]: I1001 14:33:35.828564 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f/openstack-network-exporter/0.log" Oct 01 14:33:35 crc kubenswrapper[4851]: I1001 14:33:35.911356 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f/ovsdbserver-nb/0.log" Oct 01 14:33:36 crc kubenswrapper[4851]: I1001 14:33:36.082272 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3739abde-3ff0-4c31-aeb6-f731eb37dfac/openstack-network-exporter/0.log" Oct 01 14:33:36 crc kubenswrapper[4851]: I1001 14:33:36.176762 4851 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_3739abde-3ff0-4c31-aeb6-f731eb37dfac/ovsdbserver-sb/0.log" Oct 01 14:33:36 crc kubenswrapper[4851]: I1001 14:33:36.327977 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:33:36 crc kubenswrapper[4851]: E1001 14:33:36.328483 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:33:36 crc kubenswrapper[4851]: I1001 14:33:36.662248 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7c6fb58db4-9tzr2_2845afcf-be46-4aad-a15a-79c3a54b844c/placement-api/0.log" Oct 01 14:33:36 crc kubenswrapper[4851]: I1001 14:33:36.764063 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7c6fb58db4-9tzr2_2845afcf-be46-4aad-a15a-79c3a54b844c/placement-log/0.log" Oct 01 14:33:36 crc kubenswrapper[4851]: I1001 14:33:36.768179 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e5c7e1ec-7093-4fa6-acf7-39a2839cfb11/init-config-reloader/0.log" Oct 01 14:33:36 crc kubenswrapper[4851]: I1001 14:33:36.995580 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e5c7e1ec-7093-4fa6-acf7-39a2839cfb11/config-reloader/0.log" Oct 01 14:33:37 crc kubenswrapper[4851]: I1001 14:33:37.038706 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e5c7e1ec-7093-4fa6-acf7-39a2839cfb11/prometheus/0.log" Oct 01 14:33:37 crc kubenswrapper[4851]: I1001 14:33:37.090742 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e5c7e1ec-7093-4fa6-acf7-39a2839cfb11/init-config-reloader/0.log" Oct 01 14:33:37 crc kubenswrapper[4851]: I1001 14:33:37.216617 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e5c7e1ec-7093-4fa6-acf7-39a2839cfb11/thanos-sidecar/0.log" Oct 01 14:33:37 crc kubenswrapper[4851]: I1001 14:33:37.315050 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3dc04e94-f66a-4937-86d2-24def7247794/setup-container/0.log" Oct 01 14:33:37 crc kubenswrapper[4851]: I1001 14:33:37.514141 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3dc04e94-f66a-4937-86d2-24def7247794/rabbitmq/0.log" Oct 01 14:33:37 crc kubenswrapper[4851]: I1001 14:33:37.525238 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3dc04e94-f66a-4937-86d2-24def7247794/setup-container/0.log" Oct 01 14:33:37 crc kubenswrapper[4851]: I1001 14:33:37.688682 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb/setup-container/0.log" Oct 01 14:33:37 crc kubenswrapper[4851]: I1001 14:33:37.951557 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_705d247f-3afa-49f5-ba1d-ab991af3e399/memcached/0.log" Oct 01 14:33:37 crc kubenswrapper[4851]: I1001 14:33:37.990951 4851 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-notifications-server-0_8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb/rabbitmq/0.log" Oct 01 14:33:38 crc kubenswrapper[4851]: I1001 14:33:38.031577 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb/setup-container/0.log" Oct 01 14:33:38 crc kubenswrapper[4851]: I1001 14:33:38.156579 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_73f3b1c2-f1c0-47b8-bf31-4d2a185c852e/setup-container/0.log" Oct 01 14:33:38 crc kubenswrapper[4851]: I1001 14:33:38.322635 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_73f3b1c2-f1c0-47b8-bf31-4d2a185c852e/setup-container/0.log" Oct 01 14:33:38 crc kubenswrapper[4851]: I1001 14:33:38.344660 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_73f3b1c2-f1c0-47b8-bf31-4d2a185c852e/rabbitmq/0.log" Oct 01 14:33:38 crc kubenswrapper[4851]: I1001 14:33:38.366166 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h_6946a3f7-315a-48df-8a02-af0fee0d1fce/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:33:38 crc kubenswrapper[4851]: I1001 14:33:38.536964 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-c2qg6_c97f93db-5638-4aa6-b20d-98b1602301af/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:33:38 crc kubenswrapper[4851]: I1001 14:33:38.582844 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47_9eaff7ea-f90a-4cfa-8850-cf308591ca11/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:33:39 crc kubenswrapper[4851]: I1001 14:33:39.106666 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-qntf8_75fea0b9-3dcb-4301-835b-346cfe0d09d7/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:33:39 crc kubenswrapper[4851]: I1001 14:33:39.107553 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-2cbdw_f4295079-cbbd-4349-b20b-ce68008d76e9/ssh-known-hosts-edpm-deployment/0.log" Oct 01 14:33:39 crc kubenswrapper[4851]: I1001 14:33:39.228703 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-b55c56cb7-m2hqv_39d289ff-07b0-479b-8f7a-bb1e5c108be2/proxy-server/0.log" Oct 01 14:33:39 crc kubenswrapper[4851]: I1001 14:33:39.351315 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-b55c56cb7-m2hqv_39d289ff-07b0-479b-8f7a-bb1e5c108be2/proxy-httpd/0.log" Oct 01 14:33:39 crc kubenswrapper[4851]: I1001 14:33:39.401097 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-s5dg8_b0a8dd5b-b066-4203-b283-3ff979e8da98/swift-ring-rebalance/0.log" Oct 01 14:33:39 crc kubenswrapper[4851]: I1001 14:33:39.578123 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/account-auditor/0.log" Oct 01 14:33:39 crc kubenswrapper[4851]: I1001 14:33:39.633545 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/account-reaper/0.log" Oct 01 14:33:39 crc kubenswrapper[4851]: I1001 14:33:39.648145 4851 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/account-replicator/0.log" Oct 01 14:33:39 crc kubenswrapper[4851]: I1001 14:33:39.731315 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/account-server/0.log" Oct 01 14:33:39 crc kubenswrapper[4851]: I1001 14:33:39.780645 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/container-auditor/0.log" Oct 01 14:33:39 crc kubenswrapper[4851]: I1001 14:33:39.840123 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/container-server/0.log" Oct 01 14:33:39 crc kubenswrapper[4851]: I1001 14:33:39.891879 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/container-replicator/0.log" Oct 01 14:33:39 crc kubenswrapper[4851]: I1001 14:33:39.926052 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/container-updater/0.log" Oct 01 14:33:40 crc kubenswrapper[4851]: I1001 14:33:40.003785 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/object-auditor/0.log" Oct 01 14:33:40 crc kubenswrapper[4851]: I1001 14:33:40.098052 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/object-expirer/0.log" Oct 01 14:33:40 crc kubenswrapper[4851]: I1001 14:33:40.126914 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/object-server/0.log" Oct 01 14:33:40 crc kubenswrapper[4851]: I1001 14:33:40.159313 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/object-replicator/0.log" Oct 01 14:33:40 crc kubenswrapper[4851]: I1001 14:33:40.196056 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/object-updater/0.log" Oct 01 14:33:40 crc kubenswrapper[4851]: I1001 14:33:40.282200 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/rsync/0.log" Oct 01 14:33:40 crc kubenswrapper[4851]: I1001 14:33:40.365094 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/swift-recon-cron/0.log" Oct 01 14:33:40 crc kubenswrapper[4851]: I1001 14:33:40.476835 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-kznqk_bd7a7617-48e5-42a8-9630-ce17e87cde69/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:33:40 crc kubenswrapper[4851]: I1001 14:33:40.614404 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_996ff379-292e-4a71-a09b-164fc21abe76/tempest-tests-tempest-tests-runner/0.log" Oct 01 14:33:40 crc kubenswrapper[4851]: I1001 14:33:40.685582 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_4b08465d-adad-4a72-b5bb-a50af717f8f6/test-operator-logs-container/0.log" Oct 01 14:33:40 crc kubenswrapper[4851]: I1001 14:33:40.884917 4851 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6_8457e33a-c243-4a18-80f6-8a1777d60054/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:33:42 crc kubenswrapper[4851]: I1001 14:33:42.534228 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_cad65438-644c-45b7-8267-370126fe6aef/watcher-applier/0.log" Oct 01 14:33:42 crc kubenswrapper[4851]: I1001 14:33:42.813660 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_971ca0ac-6de7-42f1-bf29-5174fd80ced4/watcher-decision-engine/3.log" Oct 01 14:33:43 crc kubenswrapper[4851]: I1001 14:33:43.387342 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_bb173f87-ff1f-4a9f-9de8-5073545e2697/watcher-api-log/0.log" Oct 01 14:33:44 crc kubenswrapper[4851]: I1001 14:33:44.802993 4851 generic.go:334] "Generic (PLEG): container finished" podID="2e4421cd-0461-44ee-b937-b12e4fb50766" containerID="75565a2cd1f594bbef331f49ceae9be67237f15992070d67ef336bf64e8b3e94" exitCode=0 Oct 01 14:33:44 crc kubenswrapper[4851]: I1001 14:33:44.803040 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvgl7" event={"ID":"2e4421cd-0461-44ee-b937-b12e4fb50766","Type":"ContainerDied","Data":"75565a2cd1f594bbef331f49ceae9be67237f15992070d67ef336bf64e8b3e94"} Oct 01 14:33:45 crc kubenswrapper[4851]: I1001 14:33:45.518051 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_971ca0ac-6de7-42f1-bf29-5174fd80ced4/watcher-decision-engine/4.log" Oct 01 14:33:45 crc kubenswrapper[4851]: I1001 14:33:45.974228 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_bb173f87-ff1f-4a9f-9de8-5073545e2697/watcher-api/0.log" Oct 01 14:33:46 crc kubenswrapper[4851]: I1001 14:33:46.825630 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvgl7" event={"ID":"2e4421cd-0461-44ee-b937-b12e4fb50766","Type":"ContainerStarted","Data":"6c4636703de834ae207627751b5cfb92432fa2c43f0ba2c50d84ec6ce26ec6c4"} Oct 01 14:33:46 crc kubenswrapper[4851]: I1001 14:33:46.851110 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cvgl7" podStartSLOduration=5.842230935 podStartE2EDuration="26.851086749s" podCreationTimestamp="2025-10-01 14:33:20 +0000 UTC" firstStartedPulling="2025-10-01 14:33:24.528007839 +0000 UTC m=+6012.873125325" lastFinishedPulling="2025-10-01 14:33:45.536863653 +0000 UTC m=+6033.881981139" observedRunningTime="2025-10-01 14:33:46.843296897 +0000 UTC m=+6035.188414403" watchObservedRunningTime="2025-10-01 14:33:46.851086749 +0000 UTC m=+6035.196204245" Oct 01 14:33:50 crc kubenswrapper[4851]: I1001 14:33:50.328279 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:33:50 crc kubenswrapper[4851]: E1001 14:33:50.329073 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:33:50 crc kubenswrapper[4851]: I1001 14:33:50.851166 4851 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cvgl7" Oct 01 14:33:50 crc kubenswrapper[4851]: I1001 14:33:50.852115 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cvgl7" Oct 01 14:33:51 crc kubenswrapper[4851]: I1001 14:33:51.907343 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cvgl7" podUID="2e4421cd-0461-44ee-b937-b12e4fb50766" containerName="registry-server" probeResult="failure" output=< Oct 01 14:33:51 crc kubenswrapper[4851]: timeout: failed to connect service ":50051" within 1s Oct 01 14:33:51 crc kubenswrapper[4851]: > Oct 01 14:34:01 crc kubenswrapper[4851]: I1001 14:34:01.897125 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cvgl7" podUID="2e4421cd-0461-44ee-b937-b12e4fb50766" containerName="registry-server" probeResult="failure" output=< Oct 01 14:34:01 crc kubenswrapper[4851]: timeout: failed to connect service ":50051" within 1s Oct 01 14:34:01 crc kubenswrapper[4851]: > Oct 01 14:34:03 crc kubenswrapper[4851]: I1001 14:34:03.329038 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:34:03 crc kubenswrapper[4851]: E1001 14:34:03.329398 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:34:10 crc kubenswrapper[4851]: I1001 14:34:10.947103 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cvgl7" Oct 01 14:34:11 crc kubenswrapper[4851]: I1001 14:34:11.002944 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cvgl7" Oct 01 14:34:11 crc kubenswrapper[4851]: I1001 14:34:11.185908 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cvgl7"] Oct 01 14:34:12 crc kubenswrapper[4851]: I1001 14:34:12.067721 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cvgl7" podUID="2e4421cd-0461-44ee-b937-b12e4fb50766" containerName="registry-server" containerID="cri-o://6c4636703de834ae207627751b5cfb92432fa2c43f0ba2c50d84ec6ce26ec6c4" gracePeriod=2 Oct 01 14:34:13 crc kubenswrapper[4851]: I1001 14:34:13.082860 4851 generic.go:334] "Generic (PLEG): container finished" podID="2e4421cd-0461-44ee-b937-b12e4fb50766" containerID="6c4636703de834ae207627751b5cfb92432fa2c43f0ba2c50d84ec6ce26ec6c4" exitCode=0 Oct 01 14:34:13 crc kubenswrapper[4851]: I1001 14:34:13.083127 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvgl7" event={"ID":"2e4421cd-0461-44ee-b937-b12e4fb50766","Type":"ContainerDied","Data":"6c4636703de834ae207627751b5cfb92432fa2c43f0ba2c50d84ec6ce26ec6c4"} Oct 01 14:34:13 crc kubenswrapper[4851]: I1001 14:34:13.897254 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cvgl7" Oct 01 14:34:14 crc kubenswrapper[4851]: I1001 14:34:14.025965 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e4421cd-0461-44ee-b937-b12e4fb50766-utilities\") pod \"2e4421cd-0461-44ee-b937-b12e4fb50766\" (UID: \"2e4421cd-0461-44ee-b937-b12e4fb50766\") " Oct 01 14:34:14 crc kubenswrapper[4851]: I1001 14:34:14.026113 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57pd6\" (UniqueName: \"kubernetes.io/projected/2e4421cd-0461-44ee-b937-b12e4fb50766-kube-api-access-57pd6\") pod \"2e4421cd-0461-44ee-b937-b12e4fb50766\" (UID: \"2e4421cd-0461-44ee-b937-b12e4fb50766\") " Oct 01 14:34:14 crc kubenswrapper[4851]: I1001 14:34:14.026287 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e4421cd-0461-44ee-b937-b12e4fb50766-catalog-content\") pod \"2e4421cd-0461-44ee-b937-b12e4fb50766\" (UID: \"2e4421cd-0461-44ee-b937-b12e4fb50766\") " Oct 01 14:34:14 crc kubenswrapper[4851]: I1001 14:34:14.026743 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e4421cd-0461-44ee-b937-b12e4fb50766-utilities" (OuterVolumeSpecName: "utilities") pod "2e4421cd-0461-44ee-b937-b12e4fb50766" (UID: "2e4421cd-0461-44ee-b937-b12e4fb50766"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:34:14 crc kubenswrapper[4851]: I1001 14:34:14.027033 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e4421cd-0461-44ee-b937-b12e4fb50766-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:34:14 crc kubenswrapper[4851]: I1001 14:34:14.035867 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e4421cd-0461-44ee-b937-b12e4fb50766-kube-api-access-57pd6" (OuterVolumeSpecName: "kube-api-access-57pd6") pod "2e4421cd-0461-44ee-b937-b12e4fb50766" (UID: "2e4421cd-0461-44ee-b937-b12e4fb50766"). InnerVolumeSpecName "kube-api-access-57pd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:34:14 crc kubenswrapper[4851]: I1001 14:34:14.095593 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvgl7" event={"ID":"2e4421cd-0461-44ee-b937-b12e4fb50766","Type":"ContainerDied","Data":"8c862bfe47bcdafd64beaa4c3e59b695abcbb5a13ce1036d09a0272b9f0bda54"} Oct 01 14:34:14 crc kubenswrapper[4851]: I1001 14:34:14.095650 4851 scope.go:117] "RemoveContainer" containerID="6c4636703de834ae207627751b5cfb92432fa2c43f0ba2c50d84ec6ce26ec6c4" Oct 01 14:34:14 crc kubenswrapper[4851]: I1001 14:34:14.095799 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cvgl7" Oct 01 14:34:14 crc kubenswrapper[4851]: I1001 14:34:14.129045 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57pd6\" (UniqueName: \"kubernetes.io/projected/2e4421cd-0461-44ee-b937-b12e4fb50766-kube-api-access-57pd6\") on node \"crc\" DevicePath \"\"" Oct 01 14:34:14 crc kubenswrapper[4851]: I1001 14:34:14.131472 4851 scope.go:117] "RemoveContainer" containerID="75565a2cd1f594bbef331f49ceae9be67237f15992070d67ef336bf64e8b3e94" Oct 01 14:34:14 crc kubenswrapper[4851]: I1001 14:34:14.137868 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e4421cd-0461-44ee-b937-b12e4fb50766-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2e4421cd-0461-44ee-b937-b12e4fb50766" (UID: "2e4421cd-0461-44ee-b937-b12e4fb50766"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:34:14 crc kubenswrapper[4851]: I1001 14:34:14.156607 4851 scope.go:117] "RemoveContainer" containerID="bc81e4439a61df1a94dab2657b71ec3e15003a0ac0db35947512432a985dd74d" Oct 01 14:34:14 crc kubenswrapper[4851]: I1001 14:34:14.231384 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e4421cd-0461-44ee-b937-b12e4fb50766-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:34:14 crc kubenswrapper[4851]: I1001 14:34:14.422013 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cvgl7"] Oct 01 14:34:14 crc kubenswrapper[4851]: I1001 14:34:14.432441 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cvgl7"] Oct 01 14:34:15 crc kubenswrapper[4851]: I1001 14:34:15.328124 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:34:15 crc kubenswrapper[4851]: E1001 14:34:15.328595 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:34:16 crc kubenswrapper[4851]: I1001 14:34:16.342872 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e4421cd-0461-44ee-b937-b12e4fb50766" path="/var/lib/kubelet/pods/2e4421cd-0461-44ee-b937-b12e4fb50766/volumes" Oct 01 14:34:28 crc kubenswrapper[4851]: I1001 14:34:28.329340 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 14:34:28 crc kubenswrapper[4851]: E1001 14:34:28.330868 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:34:39 crc kubenswrapper[4851]: I1001 14:34:39.341291 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161" Oct 01 
14:34:40 crc kubenswrapper[4851]: I1001 14:34:40.380572 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"85553da2e34a2e31e9d566f2f011387e340834a9e5c5cb46a4d8e62afb7ef26e"} Oct 01 14:34:53 crc kubenswrapper[4851]: I1001 14:34:53.459394 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ht9nx"] Oct 01 14:34:53 crc kubenswrapper[4851]: E1001 14:34:53.460662 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e4421cd-0461-44ee-b937-b12e4fb50766" containerName="registry-server" Oct 01 14:34:53 crc kubenswrapper[4851]: I1001 14:34:53.460780 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e4421cd-0461-44ee-b937-b12e4fb50766" containerName="registry-server" Oct 01 14:34:53 crc kubenswrapper[4851]: E1001 14:34:53.460796 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e4421cd-0461-44ee-b937-b12e4fb50766" containerName="extract-utilities" Oct 01 14:34:53 crc kubenswrapper[4851]: I1001 14:34:53.460805 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e4421cd-0461-44ee-b937-b12e4fb50766" containerName="extract-utilities" Oct 01 14:34:53 crc kubenswrapper[4851]: E1001 14:34:53.460827 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e4421cd-0461-44ee-b937-b12e4fb50766" containerName="extract-content" Oct 01 14:34:53 crc kubenswrapper[4851]: I1001 14:34:53.460836 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e4421cd-0461-44ee-b937-b12e4fb50766" containerName="extract-content" Oct 01 14:34:53 crc kubenswrapper[4851]: I1001 14:34:53.461093 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e4421cd-0461-44ee-b937-b12e4fb50766" containerName="registry-server" Oct 01 14:34:53 crc kubenswrapper[4851]: I1001 14:34:53.463425 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ht9nx" Oct 01 14:34:53 crc kubenswrapper[4851]: I1001 14:34:53.476626 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ht9nx"] Oct 01 14:34:53 crc kubenswrapper[4851]: I1001 14:34:53.508540 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd4w6\" (UniqueName: \"kubernetes.io/projected/383f2e61-3243-4838-82e1-2ef6efa0979d-kube-api-access-gd4w6\") pod \"certified-operators-ht9nx\" (UID: \"383f2e61-3243-4838-82e1-2ef6efa0979d\") " pod="openshift-marketplace/certified-operators-ht9nx" Oct 01 14:34:53 crc kubenswrapper[4851]: I1001 14:34:53.508603 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/383f2e61-3243-4838-82e1-2ef6efa0979d-catalog-content\") pod \"certified-operators-ht9nx\" (UID: \"383f2e61-3243-4838-82e1-2ef6efa0979d\") " pod="openshift-marketplace/certified-operators-ht9nx" Oct 01 14:34:53 crc kubenswrapper[4851]: I1001 14:34:53.508707 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/383f2e61-3243-4838-82e1-2ef6efa0979d-utilities\") pod \"certified-operators-ht9nx\" (UID: \"383f2e61-3243-4838-82e1-2ef6efa0979d\") " pod="openshift-marketplace/certified-operators-ht9nx" Oct 01 14:34:53 crc kubenswrapper[4851]: I1001 14:34:53.611310 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd4w6\" (UniqueName: \"kubernetes.io/projected/383f2e61-3243-4838-82e1-2ef6efa0979d-kube-api-access-gd4w6\") pod \"certified-operators-ht9nx\" (UID: \"383f2e61-3243-4838-82e1-2ef6efa0979d\") " pod="openshift-marketplace/certified-operators-ht9nx" Oct 01 14:34:53 crc kubenswrapper[4851]: I1001 14:34:53.611384 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/383f2e61-3243-4838-82e1-2ef6efa0979d-catalog-content\") pod \"certified-operators-ht9nx\" (UID: \"383f2e61-3243-4838-82e1-2ef6efa0979d\") " pod="openshift-marketplace/certified-operators-ht9nx" Oct 01 14:34:53 crc kubenswrapper[4851]: I1001 14:34:53.611463 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/383f2e61-3243-4838-82e1-2ef6efa0979d-utilities\") pod \"certified-operators-ht9nx\" (UID: \"383f2e61-3243-4838-82e1-2ef6efa0979d\") " pod="openshift-marketplace/certified-operators-ht9nx" Oct 01 14:34:53 crc kubenswrapper[4851]: I1001 14:34:53.612093 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/383f2e61-3243-4838-82e1-2ef6efa0979d-catalog-content\") pod \"certified-operators-ht9nx\" (UID: \"383f2e61-3243-4838-82e1-2ef6efa0979d\") " pod="openshift-marketplace/certified-operators-ht9nx" Oct 01 14:34:53 crc kubenswrapper[4851]: I1001 14:34:53.612153 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/383f2e61-3243-4838-82e1-2ef6efa0979d-utilities\") pod \"certified-operators-ht9nx\" (UID: \"383f2e61-3243-4838-82e1-2ef6efa0979d\") " pod="openshift-marketplace/certified-operators-ht9nx" Oct 01 14:34:53 crc kubenswrapper[4851]: I1001 14:34:53.650170 4851 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gd4w6\" (UniqueName: \"kubernetes.io/projected/383f2e61-3243-4838-82e1-2ef6efa0979d-kube-api-access-gd4w6\") pod \"certified-operators-ht9nx\" (UID: \"383f2e61-3243-4838-82e1-2ef6efa0979d\") " pod="openshift-marketplace/certified-operators-ht9nx" Oct 01 14:34:53 crc kubenswrapper[4851]: I1001 14:34:53.802681 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ht9nx" Oct 01 14:34:54 crc kubenswrapper[4851]: W1001 14:34:54.376568 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod383f2e61_3243_4838_82e1_2ef6efa0979d.slice/crio-f9b783035efda50657adf3be5bca0c827f6ca05f32c5e7ec00e55205e8bd6caa WatchSource:0}: Error finding container f9b783035efda50657adf3be5bca0c827f6ca05f32c5e7ec00e55205e8bd6caa: Status 404 returned error can't find the container with id f9b783035efda50657adf3be5bca0c827f6ca05f32c5e7ec00e55205e8bd6caa Oct 01 14:34:54 crc kubenswrapper[4851]: I1001 14:34:54.380989 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ht9nx"] Oct 01 14:34:54 crc kubenswrapper[4851]: I1001 14:34:54.535434 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ht9nx" event={"ID":"383f2e61-3243-4838-82e1-2ef6efa0979d","Type":"ContainerStarted","Data":"f9b783035efda50657adf3be5bca0c827f6ca05f32c5e7ec00e55205e8bd6caa"} Oct 01 14:34:55 crc kubenswrapper[4851]: I1001 14:34:55.548036 4851 generic.go:334] "Generic (PLEG): container finished" podID="383f2e61-3243-4838-82e1-2ef6efa0979d" containerID="8c1f1867585d0ca53eee0af27881b0d5bef7921cdf1c7c74ef05de040aa6d9a0" exitCode=0 Oct 01 14:34:55 crc kubenswrapper[4851]: I1001 14:34:55.548322 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ht9nx" event={"ID":"383f2e61-3243-4838-82e1-2ef6efa0979d","Type":"ContainerDied","Data":"8c1f1867585d0ca53eee0af27881b0d5bef7921cdf1c7c74ef05de040aa6d9a0"} Oct 01 14:34:55 crc kubenswrapper[4851]: I1001 14:34:55.550662 4851 generic.go:334] "Generic (PLEG): container finished" podID="58db37d1-3709-42bc-af9c-dc416e1c4654" containerID="24cb527df17535b6fa3e2f033621745f2eed1ce472df593a611a03ea04175e60" exitCode=0 Oct 01 14:34:55 crc kubenswrapper[4851]: I1001 14:34:55.550702 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6dhv/crc-debug-fvwgv" event={"ID":"58db37d1-3709-42bc-af9c-dc416e1c4654","Type":"ContainerDied","Data":"24cb527df17535b6fa3e2f033621745f2eed1ce472df593a611a03ea04175e60"} Oct 01 14:34:55 crc kubenswrapper[4851]: I1001 14:34:55.551180 4851 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 14:34:56 crc kubenswrapper[4851]: I1001 14:34:56.672352 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t6dhv/crc-debug-fvwgv" Oct 01 14:34:56 crc kubenswrapper[4851]: I1001 14:34:56.711283 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t6dhv/crc-debug-fvwgv"] Oct 01 14:34:56 crc kubenswrapper[4851]: I1001 14:34:56.723160 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t6dhv/crc-debug-fvwgv"] Oct 01 14:34:56 crc kubenswrapper[4851]: I1001 14:34:56.795721 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6g9v\" (UniqueName: \"kubernetes.io/projected/58db37d1-3709-42bc-af9c-dc416e1c4654-kube-api-access-v6g9v\") pod \"58db37d1-3709-42bc-af9c-dc416e1c4654\" (UID: \"58db37d1-3709-42bc-af9c-dc416e1c4654\") " Oct 01 14:34:56 crc kubenswrapper[4851]: I1001 14:34:56.795978 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58db37d1-3709-42bc-af9c-dc416e1c4654-host\") pod \"58db37d1-3709-42bc-af9c-dc416e1c4654\" (UID: \"58db37d1-3709-42bc-af9c-dc416e1c4654\") " Oct 01 14:34:56 crc kubenswrapper[4851]: I1001 14:34:56.796077 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58db37d1-3709-42bc-af9c-dc416e1c4654-host" (OuterVolumeSpecName: "host") pod "58db37d1-3709-42bc-af9c-dc416e1c4654" (UID: "58db37d1-3709-42bc-af9c-dc416e1c4654"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 14:34:56 crc kubenswrapper[4851]: I1001 14:34:56.796705 4851 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58db37d1-3709-42bc-af9c-dc416e1c4654-host\") on node \"crc\" DevicePath \"\"" Oct 01 14:34:56 crc kubenswrapper[4851]: I1001 14:34:56.803527 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58db37d1-3709-42bc-af9c-dc416e1c4654-kube-api-access-v6g9v" (OuterVolumeSpecName: "kube-api-access-v6g9v") pod "58db37d1-3709-42bc-af9c-dc416e1c4654" (UID: "58db37d1-3709-42bc-af9c-dc416e1c4654"). InnerVolumeSpecName "kube-api-access-v6g9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:34:56 crc kubenswrapper[4851]: I1001 14:34:56.899332 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6g9v\" (UniqueName: \"kubernetes.io/projected/58db37d1-3709-42bc-af9c-dc416e1c4654-kube-api-access-v6g9v\") on node \"crc\" DevicePath \"\"" Oct 01 14:34:57 crc kubenswrapper[4851]: I1001 14:34:57.577386 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa3e361229bc76432d915f626fee3e8690f70a51c908352ca304875f48084c8a" Oct 01 14:34:57 crc kubenswrapper[4851]: I1001 14:34:57.577667 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t6dhv/crc-debug-fvwgv" Oct 01 14:34:57 crc kubenswrapper[4851]: I1001 14:34:57.923266 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t6dhv/crc-debug-5ggjj"] Oct 01 14:34:57 crc kubenswrapper[4851]: E1001 14:34:57.924158 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58db37d1-3709-42bc-af9c-dc416e1c4654" containerName="container-00" Oct 01 14:34:57 crc kubenswrapper[4851]: I1001 14:34:57.924188 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="58db37d1-3709-42bc-af9c-dc416e1c4654" containerName="container-00" Oct 01 14:34:57 crc kubenswrapper[4851]: I1001 14:34:57.924638 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="58db37d1-3709-42bc-af9c-dc416e1c4654" containerName="container-00" Oct 01 14:34:57 crc kubenswrapper[4851]: I1001 14:34:57.925547 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6dhv/crc-debug-5ggjj" Oct 01 14:34:58 crc kubenswrapper[4851]: I1001 14:34:58.021066 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3096019a-f2c5-43b3-a7cf-56cfd4d792ad-host\") pod \"crc-debug-5ggjj\" (UID: \"3096019a-f2c5-43b3-a7cf-56cfd4d792ad\") " pod="openshift-must-gather-t6dhv/crc-debug-5ggjj" Oct 01 14:34:58 crc kubenswrapper[4851]: I1001 14:34:58.021182 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm9nk\" (UniqueName: \"kubernetes.io/projected/3096019a-f2c5-43b3-a7cf-56cfd4d792ad-kube-api-access-tm9nk\") pod \"crc-debug-5ggjj\" (UID: \"3096019a-f2c5-43b3-a7cf-56cfd4d792ad\") " pod="openshift-must-gather-t6dhv/crc-debug-5ggjj" Oct 01 14:34:58 crc kubenswrapper[4851]: I1001 14:34:58.122837 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3096019a-f2c5-43b3-a7cf-56cfd4d792ad-host\") pod \"crc-debug-5ggjj\" (UID: \"3096019a-f2c5-43b3-a7cf-56cfd4d792ad\") " pod="openshift-must-gather-t6dhv/crc-debug-5ggjj" Oct 01 14:34:58 crc kubenswrapper[4851]: I1001 14:34:58.123001 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm9nk\" (UniqueName: \"kubernetes.io/projected/3096019a-f2c5-43b3-a7cf-56cfd4d792ad-kube-api-access-tm9nk\") pod \"crc-debug-5ggjj\" (UID: \"3096019a-f2c5-43b3-a7cf-56cfd4d792ad\") " pod="openshift-must-gather-t6dhv/crc-debug-5ggjj" Oct 01 14:34:58 crc kubenswrapper[4851]: I1001 14:34:58.123637 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3096019a-f2c5-43b3-a7cf-56cfd4d792ad-host\") pod \"crc-debug-5ggjj\" (UID: \"3096019a-f2c5-43b3-a7cf-56cfd4d792ad\") " pod="openshift-must-gather-t6dhv/crc-debug-5ggjj" Oct 01 14:34:58 crc kubenswrapper[4851]: I1001 14:34:58.145368 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm9nk\" (UniqueName: \"kubernetes.io/projected/3096019a-f2c5-43b3-a7cf-56cfd4d792ad-kube-api-access-tm9nk\") pod \"crc-debug-5ggjj\" (UID: \"3096019a-f2c5-43b3-a7cf-56cfd4d792ad\") " pod="openshift-must-gather-t6dhv/crc-debug-5ggjj" Oct 01 14:34:58 crc kubenswrapper[4851]: I1001 14:34:58.245709 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t6dhv/crc-debug-5ggjj" Oct 01 14:34:58 crc kubenswrapper[4851]: W1001 14:34:58.290773 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3096019a_f2c5_43b3_a7cf_56cfd4d792ad.slice/crio-2b2b00430f8120aa14ad1d1568576bfbb801e71a0d67014e21bb471968b91fe0 WatchSource:0}: Error finding container 2b2b00430f8120aa14ad1d1568576bfbb801e71a0d67014e21bb471968b91fe0: Status 404 returned error can't find the container with id 2b2b00430f8120aa14ad1d1568576bfbb801e71a0d67014e21bb471968b91fe0 Oct 01 14:34:58 crc kubenswrapper[4851]: I1001 14:34:58.341099 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58db37d1-3709-42bc-af9c-dc416e1c4654" path="/var/lib/kubelet/pods/58db37d1-3709-42bc-af9c-dc416e1c4654/volumes" Oct 01 14:34:58 crc kubenswrapper[4851]: I1001 14:34:58.595051 4851 generic.go:334] "Generic (PLEG): container finished" podID="383f2e61-3243-4838-82e1-2ef6efa0979d" containerID="0f5d2eb0171f6e9348c6b5558f242c505de84fe7707da4870d3e361c4b07cf43" exitCode=0 Oct 01 14:34:58 crc kubenswrapper[4851]: I1001 14:34:58.595125 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ht9nx" event={"ID":"383f2e61-3243-4838-82e1-2ef6efa0979d","Type":"ContainerDied","Data":"0f5d2eb0171f6e9348c6b5558f242c505de84fe7707da4870d3e361c4b07cf43"} Oct 01 14:34:58 crc kubenswrapper[4851]: I1001 14:34:58.597287 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6dhv/crc-debug-5ggjj" event={"ID":"3096019a-f2c5-43b3-a7cf-56cfd4d792ad","Type":"ContainerStarted","Data":"2b2b00430f8120aa14ad1d1568576bfbb801e71a0d67014e21bb471968b91fe0"} Oct 01 14:34:59 crc kubenswrapper[4851]: I1001 14:34:59.611999 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6dhv/crc-debug-5ggjj" event={"ID":"3096019a-f2c5-43b3-a7cf-56cfd4d792ad","Type":"ContainerStarted","Data":"7c16113daaf420bf0870b43b6439fa4c8f41ae1e48a5ff3ee8761bed60da3f09"} Oct 01 14:34:59 crc kubenswrapper[4851]: I1001 14:34:59.633531 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t6dhv/crc-debug-5ggjj" podStartSLOduration=2.6334859870000003 podStartE2EDuration="2.633485987s" podCreationTimestamp="2025-10-01 14:34:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:34:59.629780021 +0000 UTC m=+6107.974897517" watchObservedRunningTime="2025-10-01 14:34:59.633485987 +0000 UTC m=+6107.978603483" Oct 01 14:35:00 crc kubenswrapper[4851]: I1001 14:35:00.626413 4851 generic.go:334] "Generic (PLEG): container finished" podID="3096019a-f2c5-43b3-a7cf-56cfd4d792ad" containerID="7c16113daaf420bf0870b43b6439fa4c8f41ae1e48a5ff3ee8761bed60da3f09" exitCode=0 Oct 01 14:35:00 crc kubenswrapper[4851]: I1001 14:35:00.626749 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6dhv/crc-debug-5ggjj" event={"ID":"3096019a-f2c5-43b3-a7cf-56cfd4d792ad","Type":"ContainerDied","Data":"7c16113daaf420bf0870b43b6439fa4c8f41ae1e48a5ff3ee8761bed60da3f09"} Oct 01 14:35:01 crc kubenswrapper[4851]: I1001 14:35:01.641510 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ht9nx" 
event={"ID":"383f2e61-3243-4838-82e1-2ef6efa0979d","Type":"ContainerStarted","Data":"dda7b2e73e8286df7c4992db34c39f546aebdca77cd2fd1e003188177483b3e5"} Oct 01 14:35:01 crc kubenswrapper[4851]: I1001 14:35:01.672442 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ht9nx" podStartSLOduration=3.619919825 podStartE2EDuration="8.672426753s" podCreationTimestamp="2025-10-01 14:34:53 +0000 UTC" firstStartedPulling="2025-10-01 14:34:55.550922679 +0000 UTC m=+6103.896040175" lastFinishedPulling="2025-10-01 14:35:00.603429617 +0000 UTC m=+6108.948547103" observedRunningTime="2025-10-01 14:35:01.668572604 +0000 UTC m=+6110.013690100" watchObservedRunningTime="2025-10-01 14:35:01.672426753 +0000 UTC m=+6110.017544239" Oct 01 14:35:01 crc kubenswrapper[4851]: I1001 14:35:01.747848 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6dhv/crc-debug-5ggjj" Oct 01 14:35:01 crc kubenswrapper[4851]: I1001 14:35:01.802915 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3096019a-f2c5-43b3-a7cf-56cfd4d792ad-host\") pod \"3096019a-f2c5-43b3-a7cf-56cfd4d792ad\" (UID: \"3096019a-f2c5-43b3-a7cf-56cfd4d792ad\") " Oct 01 14:35:01 crc kubenswrapper[4851]: I1001 14:35:01.803009 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3096019a-f2c5-43b3-a7cf-56cfd4d792ad-host" (OuterVolumeSpecName: "host") pod "3096019a-f2c5-43b3-a7cf-56cfd4d792ad" (UID: "3096019a-f2c5-43b3-a7cf-56cfd4d792ad"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 14:35:01 crc kubenswrapper[4851]: I1001 14:35:01.803049 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm9nk\" (UniqueName: \"kubernetes.io/projected/3096019a-f2c5-43b3-a7cf-56cfd4d792ad-kube-api-access-tm9nk\") pod \"3096019a-f2c5-43b3-a7cf-56cfd4d792ad\" (UID: \"3096019a-f2c5-43b3-a7cf-56cfd4d792ad\") " Oct 01 14:35:01 crc kubenswrapper[4851]: I1001 14:35:01.803416 4851 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3096019a-f2c5-43b3-a7cf-56cfd4d792ad-host\") on node \"crc\" DevicePath \"\"" Oct 01 14:35:01 crc kubenswrapper[4851]: I1001 14:35:01.813398 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3096019a-f2c5-43b3-a7cf-56cfd4d792ad-kube-api-access-tm9nk" (OuterVolumeSpecName: "kube-api-access-tm9nk") pod "3096019a-f2c5-43b3-a7cf-56cfd4d792ad" (UID: "3096019a-f2c5-43b3-a7cf-56cfd4d792ad"). InnerVolumeSpecName "kube-api-access-tm9nk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:35:01 crc kubenswrapper[4851]: I1001 14:35:01.904814 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm9nk\" (UniqueName: \"kubernetes.io/projected/3096019a-f2c5-43b3-a7cf-56cfd4d792ad-kube-api-access-tm9nk\") on node \"crc\" DevicePath \"\"" Oct 01 14:35:02 crc kubenswrapper[4851]: I1001 14:35:02.650408 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6dhv/crc-debug-5ggjj" event={"ID":"3096019a-f2c5-43b3-a7cf-56cfd4d792ad","Type":"ContainerDied","Data":"2b2b00430f8120aa14ad1d1568576bfbb801e71a0d67014e21bb471968b91fe0"} Oct 01 14:35:02 crc kubenswrapper[4851]: I1001 14:35:02.650765 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b2b00430f8120aa14ad1d1568576bfbb801e71a0d67014e21bb471968b91fe0" Oct 01 14:35:02 crc kubenswrapper[4851]: I1001 14:35:02.650452 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6dhv/crc-debug-5ggjj" Oct 01 14:35:03 crc kubenswrapper[4851]: I1001 14:35:03.802919 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ht9nx" Oct 01 14:35:03 crc kubenswrapper[4851]: I1001 14:35:03.802961 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ht9nx" Oct 01 14:35:03 crc kubenswrapper[4851]: I1001 14:35:03.870958 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ht9nx" Oct 01 14:35:09 crc kubenswrapper[4851]: I1001 14:35:09.103014 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t6dhv/crc-debug-5ggjj"] Oct 01 14:35:09 crc kubenswrapper[4851]: I1001 14:35:09.113873 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t6dhv/crc-debug-5ggjj"] Oct 01 14:35:10 crc kubenswrapper[4851]: I1001 14:35:10.264399 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t6dhv/crc-debug-2t6zh"] Oct 01 14:35:10 crc kubenswrapper[4851]: E1001 14:35:10.265077 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3096019a-f2c5-43b3-a7cf-56cfd4d792ad" containerName="container-00" Oct 01 14:35:10 crc kubenswrapper[4851]: I1001 14:35:10.265093 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="3096019a-f2c5-43b3-a7cf-56cfd4d792ad" containerName="container-00" Oct 01 14:35:10 crc kubenswrapper[4851]: I1001 14:35:10.265283 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="3096019a-f2c5-43b3-a7cf-56cfd4d792ad" containerName="container-00" Oct 01 14:35:10 crc kubenswrapper[4851]: I1001 14:35:10.266006 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t6dhv/crc-debug-2t6zh" Oct 01 14:35:10 crc kubenswrapper[4851]: I1001 14:35:10.339988 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3096019a-f2c5-43b3-a7cf-56cfd4d792ad" path="/var/lib/kubelet/pods/3096019a-f2c5-43b3-a7cf-56cfd4d792ad/volumes" Oct 01 14:35:10 crc kubenswrapper[4851]: I1001 14:35:10.342385 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/51635154-e9ef-401b-ad5d-bdebe4cda8fc-host\") pod \"crc-debug-2t6zh\" (UID: \"51635154-e9ef-401b-ad5d-bdebe4cda8fc\") " pod="openshift-must-gather-t6dhv/crc-debug-2t6zh" Oct 01 14:35:10 crc kubenswrapper[4851]: I1001 14:35:10.342552 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4njnf\" (UniqueName: \"kubernetes.io/projected/51635154-e9ef-401b-ad5d-bdebe4cda8fc-kube-api-access-4njnf\") pod \"crc-debug-2t6zh\" (UID: \"51635154-e9ef-401b-ad5d-bdebe4cda8fc\") " pod="openshift-must-gather-t6dhv/crc-debug-2t6zh" Oct 01 14:35:10 crc kubenswrapper[4851]: I1001 14:35:10.444866 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/51635154-e9ef-401b-ad5d-bdebe4cda8fc-host\") pod \"crc-debug-2t6zh\" (UID: \"51635154-e9ef-401b-ad5d-bdebe4cda8fc\") " pod="openshift-must-gather-t6dhv/crc-debug-2t6zh" Oct 01 14:35:10 crc kubenswrapper[4851]: I1001 14:35:10.445028 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/51635154-e9ef-401b-ad5d-bdebe4cda8fc-host\") pod \"crc-debug-2t6zh\" (UID: \"51635154-e9ef-401b-ad5d-bdebe4cda8fc\") " pod="openshift-must-gather-t6dhv/crc-debug-2t6zh" Oct 01 14:35:10 crc kubenswrapper[4851]: I1001 14:35:10.445088 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4njnf\" (UniqueName: \"kubernetes.io/projected/51635154-e9ef-401b-ad5d-bdebe4cda8fc-kube-api-access-4njnf\") pod \"crc-debug-2t6zh\" (UID: \"51635154-e9ef-401b-ad5d-bdebe4cda8fc\") " pod="openshift-must-gather-t6dhv/crc-debug-2t6zh" Oct 01 14:35:10 crc kubenswrapper[4851]: I1001 14:35:10.474371 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4njnf\" (UniqueName: \"kubernetes.io/projected/51635154-e9ef-401b-ad5d-bdebe4cda8fc-kube-api-access-4njnf\") pod \"crc-debug-2t6zh\" (UID: \"51635154-e9ef-401b-ad5d-bdebe4cda8fc\") " pod="openshift-must-gather-t6dhv/crc-debug-2t6zh" Oct 01 14:35:10 crc kubenswrapper[4851]: I1001 14:35:10.582622 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t6dhv/crc-debug-2t6zh"
Oct 01 14:35:10 crc kubenswrapper[4851]: I1001 14:35:10.727069 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6dhv/crc-debug-2t6zh" event={"ID":"51635154-e9ef-401b-ad5d-bdebe4cda8fc","Type":"ContainerStarted","Data":"4d336f3a83861d0c1940077337943936f6b8e5f9a6d38a8719bdd1af7f5af8df"}
Oct 01 14:35:11 crc kubenswrapper[4851]: I1001 14:35:11.741790 4851 generic.go:334] "Generic (PLEG): container finished" podID="51635154-e9ef-401b-ad5d-bdebe4cda8fc" containerID="876a3250d8e93e05d36b3bc8cbaa6e8b88cfd22837503eb5ba78def9ab381c33" exitCode=0
Oct 01 14:35:11 crc kubenswrapper[4851]: I1001 14:35:11.741890 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6dhv/crc-debug-2t6zh" event={"ID":"51635154-e9ef-401b-ad5d-bdebe4cda8fc","Type":"ContainerDied","Data":"876a3250d8e93e05d36b3bc8cbaa6e8b88cfd22837503eb5ba78def9ab381c33"}
Oct 01 14:35:11 crc kubenswrapper[4851]: I1001 14:35:11.786218 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t6dhv/crc-debug-2t6zh"]
Oct 01 14:35:11 crc kubenswrapper[4851]: I1001 14:35:11.800359 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t6dhv/crc-debug-2t6zh"]
Oct 01 14:35:12 crc kubenswrapper[4851]: I1001 14:35:12.883170 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6dhv/crc-debug-2t6zh"
Oct 01 14:35:12 crc kubenswrapper[4851]: I1001 14:35:12.897017 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4njnf\" (UniqueName: \"kubernetes.io/projected/51635154-e9ef-401b-ad5d-bdebe4cda8fc-kube-api-access-4njnf\") pod \"51635154-e9ef-401b-ad5d-bdebe4cda8fc\" (UID: \"51635154-e9ef-401b-ad5d-bdebe4cda8fc\") "
Oct 01 14:35:12 crc kubenswrapper[4851]: I1001 14:35:12.897285 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/51635154-e9ef-401b-ad5d-bdebe4cda8fc-host\") pod \"51635154-e9ef-401b-ad5d-bdebe4cda8fc\" (UID: \"51635154-e9ef-401b-ad5d-bdebe4cda8fc\") "
Oct 01 14:35:12 crc kubenswrapper[4851]: I1001 14:35:12.897391 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/51635154-e9ef-401b-ad5d-bdebe4cda8fc-host" (OuterVolumeSpecName: "host") pod "51635154-e9ef-401b-ad5d-bdebe4cda8fc" (UID: "51635154-e9ef-401b-ad5d-bdebe4cda8fc"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 01 14:35:12 crc kubenswrapper[4851]: I1001 14:35:12.897979 4851 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/51635154-e9ef-401b-ad5d-bdebe4cda8fc-host\") on node \"crc\" DevicePath \"\""
Oct 01 14:35:12 crc kubenswrapper[4851]: I1001 14:35:12.905869 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51635154-e9ef-401b-ad5d-bdebe4cda8fc-kube-api-access-4njnf" (OuterVolumeSpecName: "kube-api-access-4njnf") pod "51635154-e9ef-401b-ad5d-bdebe4cda8fc" (UID: "51635154-e9ef-401b-ad5d-bdebe4cda8fc"). InnerVolumeSpecName "kube-api-access-4njnf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:35:12 crc kubenswrapper[4851]: I1001 14:35:12.999746 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4njnf\" (UniqueName: \"kubernetes.io/projected/51635154-e9ef-401b-ad5d-bdebe4cda8fc-kube-api-access-4njnf\") on node \"crc\" DevicePath \"\""
Oct 01 14:35:13 crc kubenswrapper[4851]: I1001 14:35:13.525408 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc_b2793519-7741-4285-b90b-2210f2c7a421/util/0.log"
Oct 01 14:35:13 crc kubenswrapper[4851]: I1001 14:35:13.752029 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc_b2793519-7741-4285-b90b-2210f2c7a421/util/0.log"
Oct 01 14:35:13 crc kubenswrapper[4851]: I1001 14:35:13.753424 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc_b2793519-7741-4285-b90b-2210f2c7a421/pull/0.log"
Oct 01 14:35:13 crc kubenswrapper[4851]: I1001 14:35:13.761140 4851 scope.go:117] "RemoveContainer" containerID="876a3250d8e93e05d36b3bc8cbaa6e8b88cfd22837503eb5ba78def9ab381c33"
Oct 01 14:35:13 crc kubenswrapper[4851]: I1001 14:35:13.761186 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6dhv/crc-debug-2t6zh"
Oct 01 14:35:13 crc kubenswrapper[4851]: I1001 14:35:13.791706 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc_b2793519-7741-4285-b90b-2210f2c7a421/pull/0.log"
Oct 01 14:35:13 crc kubenswrapper[4851]: I1001 14:35:13.867854 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ht9nx"
Oct 01 14:35:13 crc kubenswrapper[4851]: I1001 14:35:13.920910 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ht9nx"]
Oct 01 14:35:13 crc kubenswrapper[4851]: I1001 14:35:13.960078 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc_b2793519-7741-4285-b90b-2210f2c7a421/pull/0.log"
Oct 01 14:35:13 crc kubenswrapper[4851]: I1001 14:35:13.967877 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc_b2793519-7741-4285-b90b-2210f2c7a421/util/0.log"
Oct 01 14:35:13 crc kubenswrapper[4851]: I1001 14:35:13.997423 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc_b2793519-7741-4285-b90b-2210f2c7a421/extract/0.log"
Oct 01 14:35:14 crc kubenswrapper[4851]: I1001 14:35:14.137485 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-nxjn7_613abfd6-27d5-4f52-bad5-024d71335465/kube-rbac-proxy/0.log"
Oct 01 14:35:14 crc kubenswrapper[4851]: I1001 14:35:14.200878 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-nxjn7_613abfd6-27d5-4f52-bad5-024d71335465/manager/0.log"
Oct 01 14:35:14 crc kubenswrapper[4851]: I1001 14:35:14.238807 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-ms6rc_08cf0ffd-5ec4-4406-a912-319a1c9ced15/kube-rbac-proxy/0.log"
Oct 01 14:35:14 crc kubenswrapper[4851]: I1001 14:35:14.341026 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51635154-e9ef-401b-ad5d-bdebe4cda8fc" path="/var/lib/kubelet/pods/51635154-e9ef-401b-ad5d-bdebe4cda8fc/volumes"
Oct 01 14:35:14 crc kubenswrapper[4851]: I1001 14:35:14.347211 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-ms6rc_08cf0ffd-5ec4-4406-a912-319a1c9ced15/manager/0.log"
Oct 01 14:35:14 crc kubenswrapper[4851]: I1001 14:35:14.389599 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-knmsq_4dcb48dd-5baf-415e-861e-ebfa40fc2e84/kube-rbac-proxy/0.log"
Oct 01 14:35:14 crc kubenswrapper[4851]: I1001 14:35:14.465324 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-knmsq_4dcb48dd-5baf-415e-861e-ebfa40fc2e84/manager/0.log"
Oct 01 14:35:14 crc kubenswrapper[4851]: I1001 14:35:14.547166 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-8c9zh_a1e24a95-87b4-4f06-b651-9c8c26a7021d/kube-rbac-proxy/0.log"
Oct 01 14:35:14 crc kubenswrapper[4851]: I1001 14:35:14.680008 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-8c9zh_a1e24a95-87b4-4f06-b651-9c8c26a7021d/manager/0.log"
Oct 01 14:35:14 crc kubenswrapper[4851]: I1001 14:35:14.743423 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-r7gqw_ba805aa7-4c6e-4dc1-8de8-c935ab1c2128/manager/0.log"
Oct 01 14:35:14 crc kubenswrapper[4851]: I1001 14:35:14.773594 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-r7gqw_ba805aa7-4c6e-4dc1-8de8-c935ab1c2128/kube-rbac-proxy/0.log"
Oct 01 14:35:14 crc kubenswrapper[4851]: I1001 14:35:14.792304 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ht9nx" podUID="383f2e61-3243-4838-82e1-2ef6efa0979d" containerName="registry-server" containerID="cri-o://dda7b2e73e8286df7c4992db34c39f546aebdca77cd2fd1e003188177483b3e5" gracePeriod=2
Oct 01 14:35:14 crc kubenswrapper[4851]: I1001 14:35:14.882367 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-tssnp_a01e01ff-d00e-482d-8901-72c0705672f1/kube-rbac-proxy/0.log"
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.030736 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-tssnp_a01e01ff-d00e-482d-8901-72c0705672f1/manager/0.log"
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.167899 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-4chzf_63075165-70d9-4dd4-9b52-41e59e59fcee/kube-rbac-proxy/0.log"
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.297279 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-dq52c_ac9e1ccb-a68d-446e-b47b-de00d828f332/kube-rbac-proxy/0.log"
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.379472 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-4chzf_63075165-70d9-4dd4-9b52-41e59e59fcee/manager/0.log"
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.418329 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ht9nx"
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.432481 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-dq52c_ac9e1ccb-a68d-446e-b47b-de00d828f332/manager/0.log"
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.448912 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/383f2e61-3243-4838-82e1-2ef6efa0979d-catalog-content\") pod \"383f2e61-3243-4838-82e1-2ef6efa0979d\" (UID: \"383f2e61-3243-4838-82e1-2ef6efa0979d\") "
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.449057 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd4w6\" (UniqueName: \"kubernetes.io/projected/383f2e61-3243-4838-82e1-2ef6efa0979d-kube-api-access-gd4w6\") pod \"383f2e61-3243-4838-82e1-2ef6efa0979d\" (UID: \"383f2e61-3243-4838-82e1-2ef6efa0979d\") "
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.449171 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/383f2e61-3243-4838-82e1-2ef6efa0979d-utilities\") pod \"383f2e61-3243-4838-82e1-2ef6efa0979d\" (UID: \"383f2e61-3243-4838-82e1-2ef6efa0979d\") "
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.450436 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/383f2e61-3243-4838-82e1-2ef6efa0979d-utilities" (OuterVolumeSpecName: "utilities") pod "383f2e61-3243-4838-82e1-2ef6efa0979d" (UID: "383f2e61-3243-4838-82e1-2ef6efa0979d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.492337 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/383f2e61-3243-4838-82e1-2ef6efa0979d-kube-api-access-gd4w6" (OuterVolumeSpecName: "kube-api-access-gd4w6") pod "383f2e61-3243-4838-82e1-2ef6efa0979d" (UID: "383f2e61-3243-4838-82e1-2ef6efa0979d"). InnerVolumeSpecName "kube-api-access-gd4w6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.501320 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/383f2e61-3243-4838-82e1-2ef6efa0979d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "383f2e61-3243-4838-82e1-2ef6efa0979d" (UID: "383f2e61-3243-4838-82e1-2ef6efa0979d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.519927 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-flp6z_a7cf8be8-c21f-4904-838e-f185857ef960/kube-rbac-proxy/0.log"
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.551343 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/383f2e61-3243-4838-82e1-2ef6efa0979d-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.551384 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd4w6\" (UniqueName: \"kubernetes.io/projected/383f2e61-3243-4838-82e1-2ef6efa0979d-kube-api-access-gd4w6\") on node \"crc\" DevicePath \"\""
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.551398 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/383f2e61-3243-4838-82e1-2ef6efa0979d-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.696176 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-zx88h_324c39a8-85b5-4caf-a719-a5f47a827d08/kube-rbac-proxy/0.log"
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.736946 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-zx88h_324c39a8-85b5-4caf-a719-a5f47a827d08/manager/0.log"
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.758079 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-flp6z_a7cf8be8-c21f-4904-838e-f185857ef960/manager/0.log"
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.804120 4851 generic.go:334] "Generic (PLEG): container finished" podID="383f2e61-3243-4838-82e1-2ef6efa0979d" containerID="dda7b2e73e8286df7c4992db34c39f546aebdca77cd2fd1e003188177483b3e5" exitCode=0
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.804177 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ht9nx" event={"ID":"383f2e61-3243-4838-82e1-2ef6efa0979d","Type":"ContainerDied","Data":"dda7b2e73e8286df7c4992db34c39f546aebdca77cd2fd1e003188177483b3e5"}
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.804210 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ht9nx" event={"ID":"383f2e61-3243-4838-82e1-2ef6efa0979d","Type":"ContainerDied","Data":"f9b783035efda50657adf3be5bca0c827f6ca05f32c5e7ec00e55205e8bd6caa"}
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.804206 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ht9nx"
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.804226 4851 scope.go:117] "RemoveContainer" containerID="dda7b2e73e8286df7c4992db34c39f546aebdca77cd2fd1e003188177483b3e5"
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.830725 4851 scope.go:117] "RemoveContainer" containerID="0f5d2eb0171f6e9348c6b5558f242c505de84fe7707da4870d3e361c4b07cf43"
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.843465 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ht9nx"]
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.852677 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ht9nx"]
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.863815 4851 scope.go:117] "RemoveContainer" containerID="8c1f1867585d0ca53eee0af27881b0d5bef7921cdf1c7c74ef05de040aa6d9a0"
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.920757 4851 scope.go:117] "RemoveContainer" containerID="dda7b2e73e8286df7c4992db34c39f546aebdca77cd2fd1e003188177483b3e5"
Oct 01 14:35:15 crc kubenswrapper[4851]: E1001 14:35:15.921671 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dda7b2e73e8286df7c4992db34c39f546aebdca77cd2fd1e003188177483b3e5\": container with ID starting with dda7b2e73e8286df7c4992db34c39f546aebdca77cd2fd1e003188177483b3e5 not found: ID does not exist" containerID="dda7b2e73e8286df7c4992db34c39f546aebdca77cd2fd1e003188177483b3e5"
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.921728 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dda7b2e73e8286df7c4992db34c39f546aebdca77cd2fd1e003188177483b3e5"} err="failed to get container status \"dda7b2e73e8286df7c4992db34c39f546aebdca77cd2fd1e003188177483b3e5\": rpc error: code = NotFound desc = could not find container \"dda7b2e73e8286df7c4992db34c39f546aebdca77cd2fd1e003188177483b3e5\": container with ID starting with dda7b2e73e8286df7c4992db34c39f546aebdca77cd2fd1e003188177483b3e5 not found: ID does not exist"
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.921762 4851 scope.go:117] "RemoveContainer" containerID="0f5d2eb0171f6e9348c6b5558f242c505de84fe7707da4870d3e361c4b07cf43"
Oct 01 14:35:15 crc kubenswrapper[4851]: E1001 14:35:15.922185 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f5d2eb0171f6e9348c6b5558f242c505de84fe7707da4870d3e361c4b07cf43\": container with ID starting with 0f5d2eb0171f6e9348c6b5558f242c505de84fe7707da4870d3e361c4b07cf43 not found: ID does not exist" containerID="0f5d2eb0171f6e9348c6b5558f242c505de84fe7707da4870d3e361c4b07cf43"
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.922239 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f5d2eb0171f6e9348c6b5558f242c505de84fe7707da4870d3e361c4b07cf43"} err="failed to get container status \"0f5d2eb0171f6e9348c6b5558f242c505de84fe7707da4870d3e361c4b07cf43\": rpc error: code = NotFound desc = could not find container \"0f5d2eb0171f6e9348c6b5558f242c505de84fe7707da4870d3e361c4b07cf43\": container with ID starting with 0f5d2eb0171f6e9348c6b5558f242c505de84fe7707da4870d3e361c4b07cf43 not found: ID does not exist"
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.922279 4851 scope.go:117] "RemoveContainer" containerID="8c1f1867585d0ca53eee0af27881b0d5bef7921cdf1c7c74ef05de040aa6d9a0"
Oct 01 14:35:15 crc kubenswrapper[4851]: E1001 14:35:15.923320 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c1f1867585d0ca53eee0af27881b0d5bef7921cdf1c7c74ef05de040aa6d9a0\": container with ID starting with 8c1f1867585d0ca53eee0af27881b0d5bef7921cdf1c7c74ef05de040aa6d9a0 not found: ID does not exist" containerID="8c1f1867585d0ca53eee0af27881b0d5bef7921cdf1c7c74ef05de040aa6d9a0"
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.923369 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c1f1867585d0ca53eee0af27881b0d5bef7921cdf1c7c74ef05de040aa6d9a0"} err="failed to get container status \"8c1f1867585d0ca53eee0af27881b0d5bef7921cdf1c7c74ef05de040aa6d9a0\": rpc error: code = NotFound desc = could not find container \"8c1f1867585d0ca53eee0af27881b0d5bef7921cdf1c7c74ef05de040aa6d9a0\": container with ID starting with 8c1f1867585d0ca53eee0af27881b0d5bef7921cdf1c7c74ef05de040aa6d9a0 not found: ID does not exist"
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.933735 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-qjqpv_de177c71-0c78-416b-a62f-2d73d86a2b70/manager/0.log"
Oct 01 14:35:15 crc kubenswrapper[4851]: I1001 14:35:15.990563 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-qjqpv_de177c71-0c78-416b-a62f-2d73d86a2b70/kube-rbac-proxy/0.log"
Oct 01 14:35:16 crc kubenswrapper[4851]: I1001 14:35:16.210770 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-22n8s_5d973d57-3aa0-4d14-9c4a-435f6ff880af/kube-rbac-proxy/0.log"
Oct 01 14:35:16 crc kubenswrapper[4851]: I1001 14:35:16.289594 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-mkxn2_7abbaca1-f067-43b8-a24e-0219ce7e7eaa/kube-rbac-proxy/0.log"
Oct 01 14:35:16 crc kubenswrapper[4851]: I1001 14:35:16.312869 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-22n8s_5d973d57-3aa0-4d14-9c4a-435f6ff880af/manager/0.log"
Oct 01 14:35:16 crc kubenswrapper[4851]: I1001 14:35:16.339961 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="383f2e61-3243-4838-82e1-2ef6efa0979d" path="/var/lib/kubelet/pods/383f2e61-3243-4838-82e1-2ef6efa0979d/volumes"
Oct 01 14:35:16 crc kubenswrapper[4851]: I1001 14:35:16.493321 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-mkxn2_7abbaca1-f067-43b8-a24e-0219ce7e7eaa/manager/0.log"
Oct 01 14:35:16 crc kubenswrapper[4851]: I1001 14:35:16.533536 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-qxlj6_ee6067ef-427a-49d8-99b6-694930e44a0d/kube-rbac-proxy/0.log"
Oct 01 14:35:16 crc kubenswrapper[4851]: I1001 14:35:16.607648 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-qxlj6_ee6067ef-427a-49d8-99b6-694930e44a0d/manager/0.log"
Oct 01 14:35:16 crc kubenswrapper[4851]: I1001 14:35:16.746570 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8ctmstz_9ac4cdbb-6f06-4525-8f5a-61f81a230708/kube-rbac-proxy/0.log"
Oct 01 14:35:16 crc kubenswrapper[4851]: I1001 14:35:16.824250 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8ctmstz_9ac4cdbb-6f06-4525-8f5a-61f81a230708/manager/0.log"
Oct 01 14:35:16 crc kubenswrapper[4851]: I1001 14:35:16.898144 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5cc886c7f9-x6rkr_e6e332ad-4dcd-4526-abf5-c79bfce4ee72/kube-rbac-proxy/0.log"
Oct 01 14:35:17 crc kubenswrapper[4851]: I1001 14:35:17.097488 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-548cc7d4f7-ll79v_fdc19a2b-5602-40b0-a49d-22d56b9724d7/kube-rbac-proxy/0.log"
Oct 01 14:35:17 crc kubenswrapper[4851]: I1001 14:35:17.239590 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-548cc7d4f7-ll79v_fdc19a2b-5602-40b0-a49d-22d56b9724d7/operator/0.log"
Oct 01 14:35:17 crc kubenswrapper[4851]: I1001 14:35:17.350951 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-d96ms_8a61efd7-e35e-49f6-880c-e2d18b49d157/registry-server/0.log"
Oct 01 14:35:17 crc kubenswrapper[4851]: I1001 14:35:17.549686 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-c8r2z_f3be6466-a46d-49b7-a5e4-9465c82ce165/kube-rbac-proxy/0.log"
Oct 01 14:35:17 crc kubenswrapper[4851]: I1001 14:35:17.629822 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-cslt7_d7ddf969-982c-4436-85e4-fb963d57a385/kube-rbac-proxy/0.log"
Oct 01 14:35:17 crc kubenswrapper[4851]: I1001 14:35:17.740585 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-c8r2z_f3be6466-a46d-49b7-a5e4-9465c82ce165/manager/0.log"
Oct 01 14:35:17 crc kubenswrapper[4851]: I1001 14:35:17.851162 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-cslt7_d7ddf969-982c-4436-85e4-fb963d57a385/manager/0.log"
Oct 01 14:35:17 crc kubenswrapper[4851]: I1001 14:35:17.956966 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-26s54_d08d3102-97ef-4224-8a21-fa66c86a2f11/operator/0.log"
Oct 01 14:35:18 crc kubenswrapper[4851]: I1001 14:35:18.057051 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-62xmc_9389dca9-fd43-4f83-a3a8-b755859b252e/kube-rbac-proxy/0.log"
Oct 01 14:35:18 crc kubenswrapper[4851]: I1001 14:35:18.103056 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-62xmc_9389dca9-fd43-4f83-a3a8-b755859b252e/manager/0.log"
Oct 01 14:35:18 crc kubenswrapper[4851]: I1001 14:35:18.236486 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-zzts5_c9b7da57-564a-4053-a476-08db0d87c317/kube-rbac-proxy/0.log"
Oct 01 14:35:18 crc kubenswrapper[4851]: I1001 14:35:18.440658 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5cc886c7f9-x6rkr_e6e332ad-4dcd-4526-abf5-c79bfce4ee72/manager/0.log"
Oct 01 14:35:18 crc kubenswrapper[4851]: I1001 14:35:18.460728 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-62wb7_5f0e4ca8-a97f-4a47-9ead-3e2c33881ef0/kube-rbac-proxy/0.log"
Oct 01 14:35:18 crc kubenswrapper[4851]: I1001 14:35:18.489255 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-62wb7_5f0e4ca8-a97f-4a47-9ead-3e2c33881ef0/manager/0.log"
Oct 01 14:35:18 crc kubenswrapper[4851]: I1001 14:35:18.650635 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-d64f8f9f6-2qx7f_60b302ff-f344-4556-92dc-e8f7954b80c9/kube-rbac-proxy/0.log"
Oct 01 14:35:18 crc kubenswrapper[4851]: I1001 14:35:18.666247 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-zzts5_c9b7da57-564a-4053-a476-08db0d87c317/manager/0.log"
Oct 01 14:35:18 crc kubenswrapper[4851]: I1001 14:35:18.751716 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-d64f8f9f6-2qx7f_60b302ff-f344-4556-92dc-e8f7954b80c9/manager/0.log"
Oct 01 14:35:34 crc kubenswrapper[4851]: I1001 14:35:34.150858 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-bhp8c_c7ff570b-9b52-4d2f-b030-40eca1804794/control-plane-machine-set-operator/0.log"
Oct 01 14:35:34 crc kubenswrapper[4851]: I1001 14:35:34.327780 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-68zzg_0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697/kube-rbac-proxy/0.log"
Oct 01 14:35:34 crc kubenswrapper[4851]: I1001 14:35:34.328159 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-68zzg_0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697/machine-api-operator/0.log"
Oct 01 14:35:46 crc kubenswrapper[4851]: I1001 14:35:46.397611 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-26dms_435110f0-cfd8-423b-afdc-a9cfe0426a3a/cert-manager-controller/0.log"
Oct 01 14:35:46 crc kubenswrapper[4851]: I1001 14:35:46.427228 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-d6clp_f24da60a-9cbf-482f-a764-43145245be50/cert-manager-cainjector/0.log"
Oct 01 14:35:46 crc kubenswrapper[4851]: I1001 14:35:46.565323 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-dhkqx_097254a7-5e3a-4d5b-90d5-f6f1ae0b4c3d/cert-manager-webhook/0.log"
Oct 01 14:35:58 crc kubenswrapper[4851]: I1001 14:35:58.579568 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-9kpmg_874ffc52-8446-4804-baa4-b75d2348af0d/nmstate-console-plugin/0.log"
Oct 01 14:35:58 crc kubenswrapper[4851]: I1001 14:35:58.752231 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-z4ptw_fccb3980-5aa4-4221-8f25-c14c68913a81/nmstate-handler/0.log"
Oct 01 14:35:58 crc kubenswrapper[4851]: I1001 14:35:58.869557 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-2bcp8_94133d01-7539-4ee5-9151-62f52ec7a1e8/nmstate-metrics/0.log"
Oct 01 14:35:58 crc kubenswrapper[4851]: I1001 14:35:58.870419 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-2bcp8_94133d01-7539-4ee5-9151-62f52ec7a1e8/kube-rbac-proxy/0.log"
Oct 01 14:35:59 crc kubenswrapper[4851]: I1001 14:35:59.033764 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-p2vrr_92d2d84c-4ae5-4943-9eeb-f88f900ec658/nmstate-operator/0.log"
Oct 01 14:35:59 crc kubenswrapper[4851]: I1001 14:35:59.087983 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-zpvld_f4752bf8-2355-4922-92d2-5546fd2c4340/nmstate-webhook/0.log"
Oct 01 14:36:11 crc kubenswrapper[4851]: I1001 14:36:11.876209 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-wrnrn_8516a5e5-0b45-41d2-baa6-2680bc58eb9b/kube-rbac-proxy/0.log"
Oct 01 14:36:12 crc kubenswrapper[4851]: I1001 14:36:12.137718 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-frr-files/0.log"
Oct 01 14:36:12 crc kubenswrapper[4851]: I1001 14:36:12.147476 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-wrnrn_8516a5e5-0b45-41d2-baa6-2680bc58eb9b/controller/0.log"
Oct 01 14:36:12 crc kubenswrapper[4851]: I1001 14:36:12.320829 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-frr-files/0.log"
Oct 01 14:36:12 crc kubenswrapper[4851]: I1001 14:36:12.339711 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-metrics/0.log"
Oct 01 14:36:12 crc kubenswrapper[4851]: I1001 14:36:12.340091 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-reloader/0.log"
Oct 01 14:36:12 crc kubenswrapper[4851]: I1001 14:36:12.343809 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-reloader/0.log"
Oct 01 14:36:12 crc kubenswrapper[4851]: I1001 14:36:12.553964 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-frr-files/0.log"
Oct 01 14:36:12 crc kubenswrapper[4851]: I1001 14:36:12.557585 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-metrics/0.log"
Oct 01 14:36:12 crc kubenswrapper[4851]: I1001 14:36:12.569493 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-reloader/0.log"
Oct 01 14:36:12 crc kubenswrapper[4851]: I1001 14:36:12.579611 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-metrics/0.log"
Oct 01 14:36:12 crc kubenswrapper[4851]: I1001 14:36:12.741340 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-reloader/0.log"
Oct 01 14:36:12 crc kubenswrapper[4851]: I1001 14:36:12.745734 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-metrics/0.log"
Oct 01 14:36:12 crc kubenswrapper[4851]: I1001 14:36:12.760324 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-frr-files/0.log"
Oct 01 14:36:12 crc kubenswrapper[4851]: I1001 14:36:12.801361 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/controller/0.log"
Oct 01 14:36:12 crc kubenswrapper[4851]: I1001 14:36:12.912726 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/frr-metrics/0.log"
Oct 01 14:36:12 crc kubenswrapper[4851]: I1001 14:36:12.973431 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/kube-rbac-proxy/0.log"
Oct 01 14:36:13 crc kubenswrapper[4851]: I1001 14:36:13.007225 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/kube-rbac-proxy-frr/0.log"
Oct 01 14:36:13 crc kubenswrapper[4851]: I1001 14:36:13.171057 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/reloader/0.log"
Oct 01 14:36:13 crc kubenswrapper[4851]: I1001 14:36:13.238573 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-wqlkq_35ad8647-b8e1-4935-a669-ae418db665b7/frr-k8s-webhook-server/0.log"
Oct 01 14:36:13 crc kubenswrapper[4851]: I1001 14:36:13.443611 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-bc7c7cbf4-vznd6_f4a3620c-32ab-4a6b-9a3b-0ca2e11868e9/manager/0.log"
Oct 01 14:36:13 crc kubenswrapper[4851]: I1001 14:36:13.635370 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-57f9d579c4-n58rf_f8382d60-6b68-4f9a-8aa0-8fefd3c2de4c/webhook-server/0.log"
Oct 01 14:36:13 crc kubenswrapper[4851]: I1001 14:36:13.756160 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bgphr_d8ce22ba-fa18-4924-9c48-360cd16f0857/kube-rbac-proxy/0.log"
Oct 01 14:36:14 crc kubenswrapper[4851]: I1001 14:36:14.528528 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bgphr_d8ce22ba-fa18-4924-9c48-360cd16f0857/speaker/0.log"
Oct 01 14:36:14 crc kubenswrapper[4851]: I1001 14:36:14.792289 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/frr/0.log"
Oct 01 14:36:21 crc kubenswrapper[4851]: I1001 14:36:21.212752 4851 scope.go:117] "RemoveContainer" containerID="4cd96d59a8f9777970789493d643593905d535b99c1710c6a60a7f9d03c84e75"
Oct 01 14:36:21 crc kubenswrapper[4851]: I1001 14:36:21.270362 4851 scope.go:117] "RemoveContainer" containerID="501c90412aa176f74a6ba9a47241332e2d6c9365889ee5f0b137ed9ff35f9fb9"
Oct 01 14:36:21 crc kubenswrapper[4851]: I1001 14:36:21.363118 4851 scope.go:117] "RemoveContainer" containerID="11d8bc883d5c47ad077422b1b44a68147209b6d6336dcaf07dab3b6cfd7ef30c"
Oct 01 14:36:26 crc kubenswrapper[4851]: I1001 14:36:26.436775 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8_127a1393-40c1-4ee6-84a6-9ccd6dee9595/util/0.log"
Oct 01 14:36:26 crc kubenswrapper[4851]: I1001 14:36:26.635271 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8_127a1393-40c1-4ee6-84a6-9ccd6dee9595/util/0.log"
Oct 01 14:36:26 crc kubenswrapper[4851]: I1001 14:36:26.654714 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8_127a1393-40c1-4ee6-84a6-9ccd6dee9595/pull/0.log"
Oct 01 14:36:26 crc kubenswrapper[4851]: I1001 14:36:26.677652 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8_127a1393-40c1-4ee6-84a6-9ccd6dee9595/pull/0.log"
Oct 01 14:36:26 crc kubenswrapper[4851]: I1001 14:36:26.844750 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8_127a1393-40c1-4ee6-84a6-9ccd6dee9595/extract/0.log"
Oct 01 14:36:26 crc kubenswrapper[4851]: I1001 14:36:26.868326 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8_127a1393-40c1-4ee6-84a6-9ccd6dee9595/pull/0.log"
Oct 01 14:36:26 crc kubenswrapper[4851]: I1001 14:36:26.868654 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8_127a1393-40c1-4ee6-84a6-9ccd6dee9595/util/0.log"
Oct 01 14:36:27 crc kubenswrapper[4851]: I1001 14:36:27.006991 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_d3aab3b2-075f-4afa-969e-3e32803601b9/util/0.log"
Oct 01 14:36:27 crc kubenswrapper[4851]: I1001 14:36:27.199718 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_d3aab3b2-075f-4afa-969e-3e32803601b9/util/0.log"
Oct 01 14:36:27 crc kubenswrapper[4851]: I1001 14:36:27.212544 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_d3aab3b2-075f-4afa-969e-3e32803601b9/pull/0.log"
Oct 01 14:36:27 crc kubenswrapper[4851]: I1001 14:36:27.212881 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_d3aab3b2-075f-4afa-969e-3e32803601b9/pull/0.log"
Oct 01 14:36:27 crc kubenswrapper[4851]: I1001 14:36:27.398869 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_d3aab3b2-075f-4afa-969e-3e32803601b9/util/0.log"
Oct 01 14:36:27 crc kubenswrapper[4851]: I1001 14:36:27.424327 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_d3aab3b2-075f-4afa-969e-3e32803601b9/extract/0.log"
Oct 01 14:36:27 crc kubenswrapper[4851]: I1001 14:36:27.452136 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_d3aab3b2-075f-4afa-969e-3e32803601b9/pull/0.log"
Oct 01 14:36:27 crc kubenswrapper[4851]: I1001 14:36:27.591961 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9mgxt_70558de8-d877-4f24-a9ee-18f3696799a8/extract-utilities/0.log"
Oct 01 14:36:27 crc kubenswrapper[4851]: I1001 14:36:27.707144 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9mgxt_70558de8-d877-4f24-a9ee-18f3696799a8/extract-utilities/0.log"
Oct 01 14:36:27 crc kubenswrapper[4851]: I1001 14:36:27.730283 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9mgxt_70558de8-d877-4f24-a9ee-18f3696799a8/extract-content/0.log"
Oct 01 14:36:27 crc kubenswrapper[4851]: I1001 14:36:27.759971 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9mgxt_70558de8-d877-4f24-a9ee-18f3696799a8/extract-content/0.log"
Oct 01 14:36:27 crc kubenswrapper[4851]: I1001 14:36:27.911170 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9mgxt_70558de8-d877-4f24-a9ee-18f3696799a8/extract-utilities/0.log"
Oct 01 14:36:27 crc kubenswrapper[4851]: I1001 14:36:27.921108 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9mgxt_70558de8-d877-4f24-a9ee-18f3696799a8/extract-content/0.log"
Oct 01 14:36:28 crc kubenswrapper[4851]: I1001 14:36:28.138900 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jgwl2_60fbd453-e33b-463e-8dc0-e82251bdec0d/extract-utilities/0.log"
Oct 01 14:36:28 crc kubenswrapper[4851]: I1001 14:36:28.334598 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jgwl2_60fbd453-e33b-463e-8dc0-e82251bdec0d/extract-utilities/0.log"
Oct 01 14:36:28 crc kubenswrapper[4851]: I1001 14:36:28.427913 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jgwl2_60fbd453-e33b-463e-8dc0-e82251bdec0d/extract-content/0.log"
Oct 01 14:36:28 crc kubenswrapper[4851]: I1001 14:36:28.465717 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jgwl2_60fbd453-e33b-463e-8dc0-e82251bdec0d/extract-content/0.log"
Oct 01 14:36:28 crc kubenswrapper[4851]: I1001 14:36:28.631869 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jgwl2_60fbd453-e33b-463e-8dc0-e82251bdec0d/extract-content/0.log"
Oct 01 14:36:28 crc kubenswrapper[4851]: I1001 14:36:28.634221 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jgwl2_60fbd453-e33b-463e-8dc0-e82251bdec0d/extract-utilities/0.log"
Oct 01 14:36:28 crc kubenswrapper[4851]: I1001 14:36:28.719870 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9mgxt_70558de8-d877-4f24-a9ee-18f3696799a8/registry-server/0.log"
Oct 01 14:36:28 crc kubenswrapper[4851]: I1001 14:36:28.829419 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf_ddb9987b-6077-405c-bcc2-15d713fb434d/util/0.log"
Oct 01 14:36:29 crc kubenswrapper[4851]: I1001 14:36:29.110473 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf_ddb9987b-6077-405c-bcc2-15d713fb434d/pull/0.log"
Oct 01 14:36:29 crc kubenswrapper[4851]: I1001 14:36:29.173976 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf_ddb9987b-6077-405c-bcc2-15d713fb434d/util/0.log"
Oct 01 14:36:29 crc kubenswrapper[4851]: I1001 14:36:29.210642 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf_ddb9987b-6077-405c-bcc2-15d713fb434d/pull/0.log"
Oct 01 14:36:29 crc kubenswrapper[4851]: I1001 14:36:29.438424 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf_ddb9987b-6077-405c-bcc2-15d713fb434d/util/0.log"
Oct 01 14:36:29 crc kubenswrapper[4851]: I1001 14:36:29.450320 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf_ddb9987b-6077-405c-bcc2-15d713fb434d/pull/0.log"
Oct 01 14:36:29 crc kubenswrapper[4851]: I1001 14:36:29.513128 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf_ddb9987b-6077-405c-bcc2-15d713fb434d/extract/0.log"
Oct 01 14:36:29 crc kubenswrapper[4851]: I1001 14:36:29.752163 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xtrc2_0e26c58b-4c8a-4b07-96d9-7dea3f32ed8d/marketplace-operator/0.log"
Oct 01 14:36:29 crc kubenswrapper[4851]: I1001 14:36:29.778641 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jgwl2_60fbd453-e33b-463e-8dc0-e82251bdec0d/registry-server/0.log"
Oct 01 14:36:29 crc kubenswrapper[4851]: I1001 14:36:29.943458 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-knk85_b535b20d-942b-417e-8580-f29cebe2401f/extract-utilities/0.log"
Oct 01 14:36:30 crc kubenswrapper[4851]: I1001 14:36:30.147715 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-knk85_b535b20d-942b-417e-8580-f29cebe2401f/extract-content/0.log"
Oct 01 14:36:30 crc kubenswrapper[4851]: I1001 14:36:30.186553 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-knk85_b535b20d-942b-417e-8580-f29cebe2401f/extract-utilities/0.log"
Oct 01 14:36:30 crc kubenswrapper[4851]: I1001 14:36:30.226123 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-knk85_b535b20d-942b-417e-8580-f29cebe2401f/extract-content/0.log"
Oct 01 14:36:30 crc kubenswrapper[4851]: I1001 14:36:30.354098 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-knk85_b535b20d-942b-417e-8580-f29cebe2401f/extract-utilities/0.log"
Oct 01 14:36:30 crc kubenswrapper[4851]: I1001 14:36:30.387318 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-knk85_b535b20d-942b-417e-8580-f29cebe2401f/extract-content/0.log"
Oct 01 14:36:30 crc kubenswrapper[4851]: I1001 14:36:30.464786 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bgnmp_4c1a9063-23de-46a7-bd5e-8763dee075c4/extract-utilities/0.log"
Oct 01 14:36:30 crc kubenswrapper[4851]: I1001 14:36:30.589315 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-knk85_b535b20d-942b-417e-8580-f29cebe2401f/registry-server/0.log"
Oct 01 14:36:30 crc kubenswrapper[4851]: I1001 14:36:30.622221 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bgnmp_4c1a9063-23de-46a7-bd5e-8763dee075c4/extract-utilities/0.log"
Oct 01 14:36:30 crc kubenswrapper[4851]: I1001 14:36:30.660943 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bgnmp_4c1a9063-23de-46a7-bd5e-8763dee075c4/extract-content/0.log"
Oct 01 14:36:30 crc kubenswrapper[4851]: I1001 14:36:30.715667 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bgnmp_4c1a9063-23de-46a7-bd5e-8763dee075c4/extract-content/0.log"
Oct 01 14:36:30 crc kubenswrapper[4851]: I1001 14:36:30.841905 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bgnmp_4c1a9063-23de-46a7-bd5e-8763dee075c4/extract-content/0.log"
Oct 01 14:36:30 crc kubenswrapper[4851]: I1001 14:36:30.853870 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bgnmp_4c1a9063-23de-46a7-bd5e-8763dee075c4/extract-utilities/0.log"
Oct 01 14:36:31 crc kubenswrapper[4851]: I1001 14:36:31.528991 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bgnmp_4c1a9063-23de-46a7-bd5e-8763dee075c4/registry-server/0.log"
Oct 01 14:36:42 crc kubenswrapper[4851]: I1001 14:36:42.628294 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-2tptp_6dc06d35-7a66-4c6a-bd91-1de635f2160f/prometheus-operator/0.log"
Oct 01 14:36:42 crc kubenswrapper[4851]: I1001 14:36:42.813071 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6dd5789d46-l5l7n_fd26d552-d668-4eee-b51c-468b0f48d5e7/prometheus-operator-admission-webhook/0.log"
Oct 01 14:36:42 crc kubenswrapper[4851]: I1001 14:36:42.835208 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6dd5789d46-mf765_5797d328-2922-4183-86d5-7d237952df39/prometheus-operator-admission-webhook/0.log"
Oct 01 14:36:43 crc kubenswrapper[4851]: I1001 14:36:43.014001 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-t5ld5_6280e32a-cded-420b-8a4b-f65505578cf0/perses-operator/0.log"
Oct 01 14:36:43 crc kubenswrapper[4851]: I1001 14:36:43.029692 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-9lt4m_d6d5593d-78e7-4efe-a8b1-f56bd214301b/operator/0.log"
Oct 01 14:37:00 crc kubenswrapper[4851]: I1001 14:37:00.050610 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 14:37:00 crc kubenswrapper[4851]: I1001 14:37:00.051231 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 14:37:03 crc kubenswrapper[4851]: E1001 14:37:03.889527 4851 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.251:42160->38.102.83.251:42733: write tcp 38.102.83.251:42160->38.102.83.251:42733: write: broken pipe
Oct 01 14:37:04 crc kubenswrapper[4851]: E1001 14:37:04.225826 4851 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.251:42058->38.102.83.251:42733: write tcp 38.102.83.251:42058->38.102.83.251:42733: write: broken pipe
Oct 01 14:37:30 crc kubenswrapper[4851]: I1001 14:37:30.049521 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 14:37:30 crc kubenswrapper[4851]: I1001 14:37:30.050014 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 14:38:00 crc kubenswrapper[4851]: I1001 14:38:00.049792 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 14:38:00 crc kubenswrapper[4851]: I1001 14:38:00.050374 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 14:38:00 crc kubenswrapper[4851]: I1001 14:38:00.050420 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv72m"
Oct 01 14:38:00 crc kubenswrapper[4851]: I1001 14:38:00.051380 4851 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"85553da2e34a2e31e9d566f2f011387e340834a9e5c5cb46a4d8e62afb7ef26e"} pod="openshift-machine-config-operator/machine-config-daemon-fv72m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 01 14:38:00 crc kubenswrapper[4851]: I1001 14:38:00.051466 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" containerID="cri-o://85553da2e34a2e31e9d566f2f011387e340834a9e5c5cb46a4d8e62afb7ef26e" gracePeriod=600
Oct 01 14:38:00 crc kubenswrapper[4851]: I1001 14:38:00.534882 4851 generic.go:334] "Generic (PLEG): container finished" podID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerID="85553da2e34a2e31e9d566f2f011387e340834a9e5c5cb46a4d8e62afb7ef26e" exitCode=0
Oct 01 14:38:00 crc kubenswrapper[4851]: I1001 14:38:00.534919 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerDied","Data":"85553da2e34a2e31e9d566f2f011387e340834a9e5c5cb46a4d8e62afb7ef26e"}
Oct 01 14:38:00 crc kubenswrapper[4851]: I1001 14:38:00.535161 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11"}
Oct 01 14:38:00 crc kubenswrapper[4851]: I1001 14:38:00.535186 4851 scope.go:117] "RemoveContainer" containerID="3c199352e988b2e1c9e3c2adb38105050c6800fd3181db42740624065be1c161"
Oct 01 14:38:21 crc kubenswrapper[4851]: I1001 14:38:21.696250 4851 scope.go:117] "RemoveContainer" containerID="24cb527df17535b6fa3e2f033621745f2eed1ce472df593a611a03ea04175e60"
Oct 01 14:38:58 crc kubenswrapper[4851]: I1001 14:38:58.189477 4851 generic.go:334] "Generic (PLEG): container finished" podID="dfe4a600-52da-4824-8cb1-416ea1ac94dd" containerID="9c569db2c115195f45e439d574aa36a7fabec9d8418ab7a9b880d942bb705fd2" exitCode=0
Oct 01 14:38:58 crc kubenswrapper[4851]: I1001 14:38:58.189974 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6dhv/must-gather-t6htd" event={"ID":"dfe4a600-52da-4824-8cb1-416ea1ac94dd","Type":"ContainerDied","Data":"9c569db2c115195f45e439d574aa36a7fabec9d8418ab7a9b880d942bb705fd2"}
Oct 01 14:38:58 crc kubenswrapper[4851]: I1001 14:38:58.190980 4851 scope.go:117] "RemoveContainer" containerID="9c569db2c115195f45e439d574aa36a7fabec9d8418ab7a9b880d942bb705fd2"
Oct 01 14:38:58 crc kubenswrapper[4851]: I1001 14:38:58.967670 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t6dhv_must-gather-t6htd_dfe4a600-52da-4824-8cb1-416ea1ac94dd/gather/0.log"
Oct 01 14:39:08 crc kubenswrapper[4851]: I1001 14:39:08.404750 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t6dhv/must-gather-t6htd"]
Oct 01 14:39:08 crc kubenswrapper[4851]: I1001 14:39:08.405674 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-t6dhv/must-gather-t6htd" podUID="dfe4a600-52da-4824-8cb1-416ea1ac94dd" containerName="copy" containerID="cri-o://bd04686090950d230ccec218de316aaa5002ccfc6e816b6fa8e2fb9f2681ca52" gracePeriod=2
Oct 01 14:39:08 crc kubenswrapper[4851]: I1001 14:39:08.424115 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t6dhv/must-gather-t6htd"]
Oct 01 14:39:09 crc kubenswrapper[4851]: I1001 14:39:09.053262 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t6dhv_must-gather-t6htd_dfe4a600-52da-4824-8cb1-416ea1ac94dd/copy/0.log"
Oct 01 14:39:09 crc kubenswrapper[4851]: I1001 14:39:09.054079 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6dhv/must-gather-t6htd"
Oct 01 14:39:09 crc kubenswrapper[4851]: I1001 14:39:09.194885 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6tdc\" (UniqueName: \"kubernetes.io/projected/dfe4a600-52da-4824-8cb1-416ea1ac94dd-kube-api-access-l6tdc\") pod \"dfe4a600-52da-4824-8cb1-416ea1ac94dd\" (UID: \"dfe4a600-52da-4824-8cb1-416ea1ac94dd\") "
Oct 01 14:39:09 crc kubenswrapper[4851]: I1001 14:39:09.195227 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfe4a600-52da-4824-8cb1-416ea1ac94dd-must-gather-output\") pod \"dfe4a600-52da-4824-8cb1-416ea1ac94dd\" (UID: \"dfe4a600-52da-4824-8cb1-416ea1ac94dd\") "
Oct 01 14:39:09 crc kubenswrapper[4851]: I1001 14:39:09.206493 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfe4a600-52da-4824-8cb1-416ea1ac94dd-kube-api-access-l6tdc" (OuterVolumeSpecName: "kube-api-access-l6tdc") pod "dfe4a600-52da-4824-8cb1-416ea1ac94dd" (UID: "dfe4a600-52da-4824-8cb1-416ea1ac94dd"). InnerVolumeSpecName "kube-api-access-l6tdc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:39:09 crc kubenswrapper[4851]: I1001 14:39:09.301001 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t6dhv_must-gather-t6htd_dfe4a600-52da-4824-8cb1-416ea1ac94dd/copy/0.log"
Oct 01 14:39:09 crc kubenswrapper[4851]: I1001 14:39:09.301777 4851 generic.go:334] "Generic (PLEG): container finished" podID="dfe4a600-52da-4824-8cb1-416ea1ac94dd" containerID="bd04686090950d230ccec218de316aaa5002ccfc6e816b6fa8e2fb9f2681ca52" exitCode=143
Oct 01 14:39:09 crc kubenswrapper[4851]: I1001 14:39:09.301845 4851 scope.go:117] "RemoveContainer" containerID="bd04686090950d230ccec218de316aaa5002ccfc6e816b6fa8e2fb9f2681ca52"
Oct 01 14:39:09 crc kubenswrapper[4851]: I1001 14:39:09.301884 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6tdc\" (UniqueName: \"kubernetes.io/projected/dfe4a600-52da-4824-8cb1-416ea1ac94dd-kube-api-access-l6tdc\") on node \"crc\" DevicePath \"\""
Oct 01 14:39:09 crc kubenswrapper[4851]: I1001 14:39:09.302090 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6dhv/must-gather-t6htd"
Oct 01 14:39:09 crc kubenswrapper[4851]: I1001 14:39:09.373410 4851 scope.go:117] "RemoveContainer" containerID="9c569db2c115195f45e439d574aa36a7fabec9d8418ab7a9b880d942bb705fd2"
Oct 01 14:39:09 crc kubenswrapper[4851]: I1001 14:39:09.417211 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfe4a600-52da-4824-8cb1-416ea1ac94dd-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "dfe4a600-52da-4824-8cb1-416ea1ac94dd" (UID: "dfe4a600-52da-4824-8cb1-416ea1ac94dd"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 14:39:09 crc kubenswrapper[4851]: I1001 14:39:09.479557 4851 scope.go:117] "RemoveContainer" containerID="bd04686090950d230ccec218de316aaa5002ccfc6e816b6fa8e2fb9f2681ca52"
Oct 01 14:39:09 crc kubenswrapper[4851]: E1001 14:39:09.481357 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd04686090950d230ccec218de316aaa5002ccfc6e816b6fa8e2fb9f2681ca52\": container with ID starting with bd04686090950d230ccec218de316aaa5002ccfc6e816b6fa8e2fb9f2681ca52 not found: ID does not exist" containerID="bd04686090950d230ccec218de316aaa5002ccfc6e816b6fa8e2fb9f2681ca52"
Oct 01 14:39:09 crc kubenswrapper[4851]: I1001 14:39:09.481393 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd04686090950d230ccec218de316aaa5002ccfc6e816b6fa8e2fb9f2681ca52"} err="failed to get container status \"bd04686090950d230ccec218de316aaa5002ccfc6e816b6fa8e2fb9f2681ca52\": rpc error: code = NotFound desc = could not find container \"bd04686090950d230ccec218de316aaa5002ccfc6e816b6fa8e2fb9f2681ca52\": container with ID starting with bd04686090950d230ccec218de316aaa5002ccfc6e816b6fa8e2fb9f2681ca52 not found: ID does not exist"
Oct 01 14:39:09 crc kubenswrapper[4851]: I1001 14:39:09.481414 4851 scope.go:117] "RemoveContainer" containerID="9c569db2c115195f45e439d574aa36a7fabec9d8418ab7a9b880d942bb705fd2"
Oct 01 14:39:09 crc kubenswrapper[4851]: E1001 14:39:09.483677 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c569db2c115195f45e439d574aa36a7fabec9d8418ab7a9b880d942bb705fd2\": container with ID starting with 9c569db2c115195f45e439d574aa36a7fabec9d8418ab7a9b880d942bb705fd2 not found: ID does not exist" containerID="9c569db2c115195f45e439d574aa36a7fabec9d8418ab7a9b880d942bb705fd2"
Oct 01 14:39:09 crc kubenswrapper[4851]: I1001 14:39:09.483747 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c569db2c115195f45e439d574aa36a7fabec9d8418ab7a9b880d942bb705fd2"} err="failed to get container status \"9c569db2c115195f45e439d574aa36a7fabec9d8418ab7a9b880d942bb705fd2\": rpc error: code = NotFound desc = could not find container \"9c569db2c115195f45e439d574aa36a7fabec9d8418ab7a9b880d942bb705fd2\": container with ID starting with 9c569db2c115195f45e439d574aa36a7fabec9d8418ab7a9b880d942bb705fd2 not found: ID does not exist"
Oct 01 14:39:09 crc kubenswrapper[4851]: I1001 14:39:09.507677 4851 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfe4a600-52da-4824-8cb1-416ea1ac94dd-must-gather-output\") on node \"crc\" DevicePath \"\""
Oct 01 14:39:10 crc kubenswrapper[4851]: I1001 14:39:10.339966 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfe4a600-52da-4824-8cb1-416ea1ac94dd" path="/var/lib/kubelet/pods/dfe4a600-52da-4824-8cb1-416ea1ac94dd/volumes"
Oct 01 14:39:50 crc kubenswrapper[4851]: I1001 14:39:50.684621 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-95hhg/must-gather-d2kfp"]
Oct 01 14:39:50 crc kubenswrapper[4851]: E1001 14:39:50.685687 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfe4a600-52da-4824-8cb1-416ea1ac94dd" containerName="copy"
Oct 01 14:39:50 crc kubenswrapper[4851]: I1001 14:39:50.685705 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe4a600-52da-4824-8cb1-416ea1ac94dd" containerName="copy"
Oct 01 14:39:50 crc kubenswrapper[4851]: E1001 14:39:50.685714 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51635154-e9ef-401b-ad5d-bdebe4cda8fc" containerName="container-00"
Oct 01 14:39:50 crc kubenswrapper[4851]: I1001 14:39:50.685723 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="51635154-e9ef-401b-ad5d-bdebe4cda8fc" containerName="container-00"
Oct 01 14:39:50 crc kubenswrapper[4851]: E1001 14:39:50.685746 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383f2e61-3243-4838-82e1-2ef6efa0979d" containerName="registry-server"
Oct 01 14:39:50 crc kubenswrapper[4851]: I1001 14:39:50.685754 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="383f2e61-3243-4838-82e1-2ef6efa0979d" containerName="registry-server"
Oct 01 14:39:50 crc kubenswrapper[4851]: E1001 14:39:50.685770 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383f2e61-3243-4838-82e1-2ef6efa0979d" containerName="extract-utilities"
Oct 01 14:39:50 crc kubenswrapper[4851]: I1001 14:39:50.685779 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="383f2e61-3243-4838-82e1-2ef6efa0979d" containerName="extract-utilities"
Oct 01 14:39:50 crc kubenswrapper[4851]: E1001 14:39:50.685792 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfe4a600-52da-4824-8cb1-416ea1ac94dd" containerName="gather"
Oct 01 14:39:50 crc kubenswrapper[4851]: I1001 14:39:50.685799 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe4a600-52da-4824-8cb1-416ea1ac94dd" containerName="gather"
Oct 01 14:39:50 crc kubenswrapper[4851]: E1001 14:39:50.685829 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383f2e61-3243-4838-82e1-2ef6efa0979d" containerName="extract-content"
Oct 01 14:39:50 crc kubenswrapper[4851]: I1001 14:39:50.685836 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="383f2e61-3243-4838-82e1-2ef6efa0979d" containerName="extract-content"
Oct 01 14:39:50 crc kubenswrapper[4851]: I1001 14:39:50.686104 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfe4a600-52da-4824-8cb1-416ea1ac94dd" containerName="copy"
Oct 01 14:39:50 crc kubenswrapper[4851]: I1001 14:39:50.686130 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="383f2e61-3243-4838-82e1-2ef6efa0979d" containerName="registry-server"
Oct 01 14:39:50 crc kubenswrapper[4851]: I1001 14:39:50.686159 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfe4a600-52da-4824-8cb1-416ea1ac94dd" containerName="gather"
Oct 01 14:39:50 crc kubenswrapper[4851]: I1001 14:39:50.686170 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="51635154-e9ef-401b-ad5d-bdebe4cda8fc" containerName="container-00"
Oct 01 14:39:50 crc kubenswrapper[4851]: I1001 14:39:50.687562 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-95hhg/must-gather-d2kfp"
Oct 01 14:39:50 crc kubenswrapper[4851]: I1001 14:39:50.690558 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-95hhg"/"kube-root-ca.crt"
Oct 01 14:39:50 crc kubenswrapper[4851]: I1001 14:39:50.690927 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-95hhg"/"default-dockercfg-lp4n4"
Oct 01 14:39:50 crc kubenswrapper[4851]: I1001 14:39:50.691157 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-95hhg"/"openshift-service-ca.crt"
Oct 01 14:39:50 crc kubenswrapper[4851]: I1001 14:39:50.735982 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-95hhg/must-gather-d2kfp"]
Oct 01 14:39:50 crc kubenswrapper[4851]: I1001 14:39:50.863518 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddz9r\" (UniqueName: \"kubernetes.io/projected/480965d4-fec5-4956-9747-11204e8a03ba-kube-api-access-ddz9r\") pod \"must-gather-d2kfp\" (UID: \"480965d4-fec5-4956-9747-11204e8a03ba\") " pod="openshift-must-gather-95hhg/must-gather-d2kfp"
Oct 01 14:39:50 crc kubenswrapper[4851]: I1001 14:39:50.863622 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/480965d4-fec5-4956-9747-11204e8a03ba-must-gather-output\") pod \"must-gather-d2kfp\" (UID: \"480965d4-fec5-4956-9747-11204e8a03ba\") " pod="openshift-must-gather-95hhg/must-gather-d2kfp"
Oct 01 14:39:50 crc kubenswrapper[4851]: I1001 14:39:50.966568 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddz9r\" (UniqueName: \"kubernetes.io/projected/480965d4-fec5-4956-9747-11204e8a03ba-kube-api-access-ddz9r\") pod \"must-gather-d2kfp\" (UID: \"480965d4-fec5-4956-9747-11204e8a03ba\") " pod="openshift-must-gather-95hhg/must-gather-d2kfp"
Oct 01 14:39:50 crc kubenswrapper[4851]: I1001 14:39:50.966686 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/480965d4-fec5-4956-9747-11204e8a03ba-must-gather-output\") pod \"must-gather-d2kfp\" (UID: \"480965d4-fec5-4956-9747-11204e8a03ba\") " pod="openshift-must-gather-95hhg/must-gather-d2kfp"
Oct 01 14:39:50 crc kubenswrapper[4851]: I1001 14:39:50.967260 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/480965d4-fec5-4956-9747-11204e8a03ba-must-gather-output\") pod \"must-gather-d2kfp\" (UID: \"480965d4-fec5-4956-9747-11204e8a03ba\") " pod="openshift-must-gather-95hhg/must-gather-d2kfp"
Oct 01 14:39:51 crc kubenswrapper[4851]: I1001 14:39:50.988313 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddz9r\" (UniqueName: \"kubernetes.io/projected/480965d4-fec5-4956-9747-11204e8a03ba-kube-api-access-ddz9r\") pod \"must-gather-d2kfp\" (UID: \"480965d4-fec5-4956-9747-11204e8a03ba\") " pod="openshift-must-gather-95hhg/must-gather-d2kfp"
Oct 01 14:39:51 crc kubenswrapper[4851]: I1001 14:39:51.015839 4851 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-95hhg/must-gather-d2kfp" Oct 01 14:39:51 crc kubenswrapper[4851]: I1001 14:39:51.501910 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-95hhg/must-gather-d2kfp"] Oct 01 14:39:51 crc kubenswrapper[4851]: I1001 14:39:51.805599 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-95hhg/must-gather-d2kfp" event={"ID":"480965d4-fec5-4956-9747-11204e8a03ba","Type":"ContainerStarted","Data":"f6338c28d3c07f7b1a0cfa8c65fdea614c532ebbb181411b0a625cf5389f30a1"} Oct 01 14:39:51 crc kubenswrapper[4851]: I1001 14:39:51.805688 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-95hhg/must-gather-d2kfp" event={"ID":"480965d4-fec5-4956-9747-11204e8a03ba","Type":"ContainerStarted","Data":"b1a0c3c1e6b863bc27715b8ccbcca9a4cce000b77f6bd18f83764c8130fe8114"} Oct 01 14:39:52 crc kubenswrapper[4851]: I1001 14:39:52.820285 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-95hhg/must-gather-d2kfp" event={"ID":"480965d4-fec5-4956-9747-11204e8a03ba","Type":"ContainerStarted","Data":"7879b2c565288508fc4be33c71502e51b6fb2410aee9c9bd8c7010a6212bea84"} Oct 01 14:39:52 crc kubenswrapper[4851]: I1001 14:39:52.848416 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-95hhg/must-gather-d2kfp" podStartSLOduration=2.848369601 podStartE2EDuration="2.848369601s" podCreationTimestamp="2025-10-01 14:39:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:39:52.837035358 +0000 UTC m=+6401.182152874" watchObservedRunningTime="2025-10-01 14:39:52.848369601 +0000 UTC m=+6401.193487097" Oct 01 14:39:55 crc kubenswrapper[4851]: I1001 14:39:55.476678 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-95hhg/crc-debug-sg7gg"] Oct 01 14:39:55 crc kubenswrapper[4851]: I1001 14:39:55.478474 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-95hhg/crc-debug-sg7gg" Oct 01 14:39:55 crc kubenswrapper[4851]: I1001 14:39:55.555869 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eaa405d4-4e1b-4b65-a906-0bd554e274d0-host\") pod \"crc-debug-sg7gg\" (UID: \"eaa405d4-4e1b-4b65-a906-0bd554e274d0\") " pod="openshift-must-gather-95hhg/crc-debug-sg7gg" Oct 01 14:39:55 crc kubenswrapper[4851]: I1001 14:39:55.556274 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxv2z\" (UniqueName: \"kubernetes.io/projected/eaa405d4-4e1b-4b65-a906-0bd554e274d0-kube-api-access-vxv2z\") pod \"crc-debug-sg7gg\" (UID: \"eaa405d4-4e1b-4b65-a906-0bd554e274d0\") " pod="openshift-must-gather-95hhg/crc-debug-sg7gg" Oct 01 14:39:55 crc kubenswrapper[4851]: I1001 14:39:55.658364 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxv2z\" (UniqueName: \"kubernetes.io/projected/eaa405d4-4e1b-4b65-a906-0bd554e274d0-kube-api-access-vxv2z\") pod \"crc-debug-sg7gg\" (UID: \"eaa405d4-4e1b-4b65-a906-0bd554e274d0\") " pod="openshift-must-gather-95hhg/crc-debug-sg7gg" Oct 01 14:39:55 crc kubenswrapper[4851]: I1001 14:39:55.658454 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eaa405d4-4e1b-4b65-a906-0bd554e274d0-host\") pod \"crc-debug-sg7gg\" (UID: \"eaa405d4-4e1b-4b65-a906-0bd554e274d0\") " pod="openshift-must-gather-95hhg/crc-debug-sg7gg" Oct 01 14:39:55 crc kubenswrapper[4851]: I1001 14:39:55.658567 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eaa405d4-4e1b-4b65-a906-0bd554e274d0-host\") pod \"crc-debug-sg7gg\" (UID: \"eaa405d4-4e1b-4b65-a906-0bd554e274d0\") " pod="openshift-must-gather-95hhg/crc-debug-sg7gg" Oct 01 14:39:55 crc kubenswrapper[4851]: I1001 14:39:55.686434 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxv2z\" (UniqueName: \"kubernetes.io/projected/eaa405d4-4e1b-4b65-a906-0bd554e274d0-kube-api-access-vxv2z\") pod \"crc-debug-sg7gg\" (UID: \"eaa405d4-4e1b-4b65-a906-0bd554e274d0\") " pod="openshift-must-gather-95hhg/crc-debug-sg7gg" Oct 01 14:39:55 crc kubenswrapper[4851]: I1001 14:39:55.807593 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-95hhg/crc-debug-sg7gg" Oct 01 14:39:55 crc kubenswrapper[4851]: W1001 14:39:55.844702 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaa405d4_4e1b_4b65_a906_0bd554e274d0.slice/crio-17311fae5bea6d1bc367153a885c422f2cb3a5d22db296592fef967a4bea99db WatchSource:0}: Error finding container 17311fae5bea6d1bc367153a885c422f2cb3a5d22db296592fef967a4bea99db: Status 404 returned error can't find the container with id 17311fae5bea6d1bc367153a885c422f2cb3a5d22db296592fef967a4bea99db Oct 01 14:39:55 crc kubenswrapper[4851]: I1001 14:39:55.857612 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-95hhg/crc-debug-sg7gg" event={"ID":"eaa405d4-4e1b-4b65-a906-0bd554e274d0","Type":"ContainerStarted","Data":"17311fae5bea6d1bc367153a885c422f2cb3a5d22db296592fef967a4bea99db"} Oct 01 14:39:56 crc kubenswrapper[4851]: I1001 14:39:56.879651 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-95hhg/crc-debug-sg7gg" event={"ID":"eaa405d4-4e1b-4b65-a906-0bd554e274d0","Type":"ContainerStarted","Data":"e1e2eae2a32e596bdac7a5350fbc36c58f075916a902ed14d0adadafde9b1f2f"} Oct 01 14:39:56 crc kubenswrapper[4851]: I1001 14:39:56.901867 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-95hhg/crc-debug-sg7gg" podStartSLOduration=1.9018494289999999 podStartE2EDuration="1.901849429s" podCreationTimestamp="2025-10-01 14:39:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:39:56.891769402 +0000 UTC m=+6405.236886878" watchObservedRunningTime="2025-10-01 14:39:56.901849429 +0000 UTC m=+6405.246966915" Oct 01 14:40:00 crc kubenswrapper[4851]: I1001 14:40:00.050222 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:40:00 crc kubenswrapper[4851]: I1001 14:40:00.050788 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:40:30 crc kubenswrapper[4851]: I1001 14:40:30.050697 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:40:30 crc kubenswrapper[4851]: I1001 14:40:30.051132 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:41:00 crc kubenswrapper[4851]: I1001 14:41:00.050447 4851 patch_prober.go:28] interesting pod/machine-config-daemon-fv72m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:41:00 crc kubenswrapper[4851]: I1001 14:41:00.052308 4851 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:41:00 crc kubenswrapper[4851]: I1001 14:41:00.052967 4851 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" Oct 01 14:41:00 crc kubenswrapper[4851]: I1001 14:41:00.054772 4851 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11"} pod="openshift-machine-config-operator/machine-config-daemon-fv72m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 14:41:00 crc kubenswrapper[4851]: I1001 14:41:00.055152 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerName="machine-config-daemon" containerID="cri-o://c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" gracePeriod=600 Oct 01 14:41:00 crc kubenswrapper[4851]: E1001 14:41:00.265730 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:41:00 crc kubenswrapper[4851]: I1001 14:41:00.473809 4851 generic.go:334] "Generic (PLEG): container finished" podID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" exitCode=0 Oct 01 14:41:00 crc kubenswrapper[4851]: I1001 14:41:00.473845 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerDied","Data":"c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11"} Oct 01 14:41:00 crc kubenswrapper[4851]: I1001 14:41:00.473918 4851 scope.go:117] "RemoveContainer" containerID="85553da2e34a2e31e9d566f2f011387e340834a9e5c5cb46a4d8e62afb7ef26e" Oct 01 14:41:00 crc kubenswrapper[4851]: I1001 14:41:00.474755 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:41:00 crc kubenswrapper[4851]: E1001 14:41:00.475086 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:41:10 crc kubenswrapper[4851]: I1001 
14:41:10.706282 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-79bfcf4f68-4nz4v_62a44312-8062-4c82-ab00-f87600fa8f93/barbican-api-log/0.log" Oct 01 14:41:10 crc kubenswrapper[4851]: I1001 14:41:10.735699 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-79bfcf4f68-4nz4v_62a44312-8062-4c82-ab00-f87600fa8f93/barbican-api/0.log" Oct 01 14:41:10 crc kubenswrapper[4851]: I1001 14:41:10.910205 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f7d86b4d6-9685z_83d01de2-5036-447d-9bcc-adae38fc5202/barbican-keystone-listener/0.log" Oct 01 14:41:11 crc kubenswrapper[4851]: I1001 14:41:11.034418 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f7d86b4d6-9685z_83d01de2-5036-447d-9bcc-adae38fc5202/barbican-keystone-listener-log/0.log" Oct 01 14:41:11 crc kubenswrapper[4851]: I1001 14:41:11.141004 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6c49b89697-kzlf9_4fb21634-7084-49a9-88d7-8759e5c794cb/barbican-worker/0.log" Oct 01 14:41:11 crc kubenswrapper[4851]: I1001 14:41:11.214740 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6c49b89697-kzlf9_4fb21634-7084-49a9-88d7-8759e5c794cb/barbican-worker-log/0.log" Oct 01 14:41:11 crc kubenswrapper[4851]: I1001 14:41:11.316657 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-cswrf_db4056ce-42b4-4853-9e9f-69320e29e5cc/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:41:11 crc kubenswrapper[4851]: I1001 14:41:11.759977 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5b04518a-1699-4ef1-8b54-57c7343e081c/ceilometer-central-agent/0.log" Oct 01 14:41:11 crc kubenswrapper[4851]: I1001 14:41:11.773442 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5b04518a-1699-4ef1-8b54-57c7343e081c/ceilometer-notification-agent/0.log" Oct 01 14:41:11 crc kubenswrapper[4851]: I1001 14:41:11.786511 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5b04518a-1699-4ef1-8b54-57c7343e081c/proxy-httpd/0.log" Oct 01 14:41:11 crc kubenswrapper[4851]: I1001 14:41:11.920622 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5b04518a-1699-4ef1-8b54-57c7343e081c/sg-core/0.log" Oct 01 14:41:12 crc kubenswrapper[4851]: I1001 14:41:12.147322 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_698a82d3-c7a7-4b4b-8cf9-46f6589744d9/cinder-api-log/0.log" Oct 01 14:41:12 crc kubenswrapper[4851]: I1001 14:41:12.322036 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_698a82d3-c7a7-4b4b-8cf9-46f6589744d9/cinder-api/0.log" Oct 01 14:41:12 crc kubenswrapper[4851]: I1001 14:41:12.418059 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9/cinder-scheduler/0.log" Oct 01 14:41:12 crc kubenswrapper[4851]: I1001 14:41:12.542823 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_dcbf27cf-3d1e-48c8-89f2-ac85568c6ae9/probe/0.log" Oct 01 14:41:12 crc kubenswrapper[4851]: I1001 14:41:12.604819 4851 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-5tcdw_4e073c07-f76e-424b-a1d1-68fcabf7f063/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:41:12 crc kubenswrapper[4851]: I1001 14:41:12.758663 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-khbc7_7966e9d5-430c-417e-9ba2-b53c598831e7/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:41:12 crc kubenswrapper[4851]: I1001 14:41:12.986723 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-mjkdd_0168fd9f-0f7b-432d-a09d-927ac34e34b3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:41:13 crc kubenswrapper[4851]: I1001 14:41:13.025527 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-659b5fb8c5-5wqbr_73149e01-4273-4703-a6bd-0b44c3ce5aad/init/0.log" Oct 01 14:41:13 crc kubenswrapper[4851]: I1001 14:41:13.276932 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-659b5fb8c5-5wqbr_73149e01-4273-4703-a6bd-0b44c3ce5aad/init/0.log" Oct 01 14:41:13 crc kubenswrapper[4851]: I1001 14:41:13.419075 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-659b5fb8c5-5wqbr_73149e01-4273-4703-a6bd-0b44c3ce5aad/dnsmasq-dns/0.log" Oct 01 14:41:13 crc kubenswrapper[4851]: I1001 14:41:13.492485 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-pp46c_6d597e1f-4712-4c7f-8f88-c2d3d9e09e5b/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:41:13 crc kubenswrapper[4851]: I1001 14:41:13.656353 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c/glance-httpd/0.log" Oct 01 14:41:13 crc kubenswrapper[4851]: I1001 14:41:13.721963 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ff6e380f-fa95-4b19-b3ed-26f7c9a8f47c/glance-log/0.log" Oct 01 14:41:13 crc kubenswrapper[4851]: I1001 14:41:13.856672 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0/glance-httpd/0.log" Oct 01 14:41:13 crc kubenswrapper[4851]: I1001 14:41:13.890106 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_41eb8875-7a56-4dcc-9400-ad6bbe2dc7a0/glance-log/0.log" Oct 01 14:41:14 crc kubenswrapper[4851]: I1001 14:41:14.050680 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-58c9859d68-bckn5_2fcf93f8-06db-4cab-8699-9051ca2ae50a/horizon/0.log" Oct 01 14:41:14 crc kubenswrapper[4851]: I1001 14:41:14.226871 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-b7rcn_dd4ecbfe-8a57-43e0-9ef9-bdbdd00053e0/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:41:14 crc kubenswrapper[4851]: I1001 14:41:14.527114 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-swrg6_b850e5c8-7c3d-4ae2-ab48-a17e80c41091/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:41:14 crc kubenswrapper[4851]: I1001 14:41:14.737394 4851 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29322121-sgptb_22136cf3-22ce-4ebc-b5b6-0b61f25844e8/keystone-cron/0.log" Oct 01 14:41:14 crc kubenswrapper[4851]: I1001 14:41:14.847527 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-58c9859d68-bckn5_2fcf93f8-06db-4cab-8699-9051ca2ae50a/horizon-log/0.log" Oct 01 14:41:14 crc kubenswrapper[4851]: I1001 14:41:14.931744 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_3ce21dde-21c2-49d7-aac0-cba896d9b1de/kube-state-metrics/0.log" Oct 01 14:41:15 crc kubenswrapper[4851]: I1001 14:41:15.134325 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-78d78d4545-tv25n_bdf629fa-5ac6-4985-a03a-c77c19cc9adb/keystone-api/0.log" Oct 01 14:41:15 crc kubenswrapper[4851]: I1001 14:41:15.184089 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-9s94l_18d67c2a-547a-448f-8492-c4c997cc938e/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:41:15 crc kubenswrapper[4851]: I1001 14:41:15.328752 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:41:15 crc kubenswrapper[4851]: E1001 14:41:15.329259 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:41:15 crc kubenswrapper[4851]: I1001 14:41:15.724892 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-57698c5d89-m6vxz_0f07dc57-fd61-4799-8658-3ed1fcc9f01c/neutron-api/0.log" Oct 01 14:41:15 crc kubenswrapper[4851]: I1001 14:41:15.743884 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-57698c5d89-m6vxz_0f07dc57-fd61-4799-8658-3ed1fcc9f01c/neutron-httpd/0.log" Oct 01 14:41:15 crc kubenswrapper[4851]: I1001 14:41:15.841320 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-7v94v_f1395fd2-649e-42f8-b320-5d81ae321978/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:41:16 crc kubenswrapper[4851]: I1001 14:41:16.755372 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_1e7d4479-701c-47c9-92b6-31ea543b479f/nova-cell0-conductor-conductor/0.log" Oct 01 14:41:17 crc kubenswrapper[4851]: I1001 14:41:17.422786 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_9e2cacf3-b4a4-4839-a723-4bd4578935a7/nova-cell1-conductor-conductor/0.log" Oct 01 14:41:17 crc kubenswrapper[4851]: I1001 14:41:17.709041 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_88f8fcd9-1105-4592-a420-98ea9033c3d9/nova-api-log/0.log" Oct 01 14:41:18 crc kubenswrapper[4851]: I1001 14:41:18.059662 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_88f8fcd9-1105-4592-a420-98ea9033c3d9/nova-api-api/0.log" Oct 01 14:41:18 crc kubenswrapper[4851]: I1001 14:41:18.102410 4851 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_1ab5594b-8bc0-4c89-b570-604b54931bbe/nova-cell1-novncproxy-novncproxy/0.log" Oct 01 14:41:18 crc kubenswrapper[4851]: I1001 14:41:18.228352 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-5sjkx_75a874c6-cd94-467a-ab74-bede44646604/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:41:18 crc kubenswrapper[4851]: I1001 14:41:18.384807 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522/nova-metadata-log/0.log" Oct 01 14:41:18 crc kubenswrapper[4851]: I1001 14:41:18.995866 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d829ebf1-e5aa-4c23-9a7e-db128f394557/nova-scheduler-scheduler/0.log" Oct 01 14:41:19 crc kubenswrapper[4851]: I1001 14:41:19.064429 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_311e8f50-9e4a-4a03-bc24-04b76d53a238/mysql-bootstrap/0.log" Oct 01 14:41:19 crc kubenswrapper[4851]: I1001 14:41:19.195006 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_311e8f50-9e4a-4a03-bc24-04b76d53a238/mysql-bootstrap/0.log" Oct 01 14:41:19 crc kubenswrapper[4851]: I1001 14:41:19.283780 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_311e8f50-9e4a-4a03-bc24-04b76d53a238/galera/0.log" Oct 01 14:41:19 crc kubenswrapper[4851]: I1001 14:41:19.486857 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7c2ac1cc-c49c-4966-a357-2d1ba04d5671/mysql-bootstrap/0.log" Oct 01 14:41:19 crc kubenswrapper[4851]: I1001 14:41:19.680901 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7c2ac1cc-c49c-4966-a357-2d1ba04d5671/galera/0.log" Oct 01 14:41:19 crc kubenswrapper[4851]: I1001 14:41:19.700258 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7c2ac1cc-c49c-4966-a357-2d1ba04d5671/mysql-bootstrap/0.log" Oct 01 14:41:19 crc kubenswrapper[4851]: I1001 14:41:19.920675 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_3c78b195-7b7f-4aa7-9d40-f0a36ab0f32f/openstackclient/0.log" Oct 01 14:41:20 crc kubenswrapper[4851]: I1001 14:41:20.179246 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-dmq5k_7b15e5b7-162e-46ee-a292-bf763704cda6/ovn-controller/0.log" Oct 01 14:41:20 crc kubenswrapper[4851]: I1001 14:41:20.368834 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8sm5d_236d88b9-092a-448d-a793-b133f7abe5f9/openstack-network-exporter/0.log" Oct 01 14:41:20 crc kubenswrapper[4851]: I1001 14:41:20.633211 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gt7fw_928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22/ovsdb-server-init/0.log" Oct 01 14:41:20 crc kubenswrapper[4851]: I1001 14:41:20.847673 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gt7fw_928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22/ovsdb-server-init/0.log" Oct 01 14:41:21 crc kubenswrapper[4851]: I1001 14:41:21.027108 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gt7fw_928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22/ovsdb-server/0.log" Oct 01 14:41:21 crc kubenswrapper[4851]: I1001 14:41:21.243592 4851 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f4a38ade-2f8c-4e58-9ee1-ad4d9ef4d522/nova-metadata-metadata/0.log" Oct 01 14:41:21 crc kubenswrapper[4851]: I1001 14:41:21.248374 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gt7fw_928b6ee6-bba0-4da7-a3d5-80b9bcfdfb22/ovs-vswitchd/0.log" Oct 01 14:41:21 crc kubenswrapper[4851]: I1001 14:41:21.447078 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-bkddc_d579751c-546b-4a48-8ae6-f9753609107a/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:41:21 crc kubenswrapper[4851]: I1001 14:41:21.494143 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_59b6f03e-998a-46c8-a0d7-ce0cbad79b3b/openstack-network-exporter/0.log" Oct 01 14:41:21 crc kubenswrapper[4851]: I1001 14:41:21.637444 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_59b6f03e-998a-46c8-a0d7-ce0cbad79b3b/ovn-northd/0.log" Oct 01 14:41:21 crc kubenswrapper[4851]: I1001 14:41:21.723351 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f/openstack-network-exporter/0.log" Oct 01 14:41:21 crc kubenswrapper[4851]: I1001 14:41:21.832071 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c3e4f5ad-f2e6-4a32-a900-bba6cc4ac88f/ovsdbserver-nb/0.log" Oct 01 14:41:21 crc kubenswrapper[4851]: I1001 14:41:21.840863 4851 scope.go:117] "RemoveContainer" containerID="7c16113daaf420bf0870b43b6439fa4c8f41ae1e48a5ff3ee8761bed60da3f09" Oct 01 14:41:21 crc kubenswrapper[4851]: I1001 14:41:21.973850 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3739abde-3ff0-4c31-aeb6-f731eb37dfac/openstack-network-exporter/0.log" Oct 01 14:41:22 crc kubenswrapper[4851]: I1001 14:41:22.032829 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3739abde-3ff0-4c31-aeb6-f731eb37dfac/ovsdbserver-sb/0.log" Oct 01 14:41:22 crc kubenswrapper[4851]: I1001 14:41:22.473661 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7c6fb58db4-9tzr2_2845afcf-be46-4aad-a15a-79c3a54b844c/placement-api/0.log" Oct 01 14:41:22 crc kubenswrapper[4851]: I1001 14:41:22.513008 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7c6fb58db4-9tzr2_2845afcf-be46-4aad-a15a-79c3a54b844c/placement-log/0.log" Oct 01 14:41:22 crc kubenswrapper[4851]: I1001 14:41:22.624104 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e5c7e1ec-7093-4fa6-acf7-39a2839cfb11/init-config-reloader/0.log" Oct 01 14:41:22 crc kubenswrapper[4851]: I1001 14:41:22.809825 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e5c7e1ec-7093-4fa6-acf7-39a2839cfb11/prometheus/0.log" Oct 01 14:41:22 crc kubenswrapper[4851]: I1001 14:41:22.830545 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e5c7e1ec-7093-4fa6-acf7-39a2839cfb11/init-config-reloader/0.log" Oct 01 14:41:22 crc kubenswrapper[4851]: I1001 14:41:22.830960 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e5c7e1ec-7093-4fa6-acf7-39a2839cfb11/config-reloader/0.log" Oct 01 14:41:23 crc kubenswrapper[4851]: I1001 14:41:23.016984 4851 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e5c7e1ec-7093-4fa6-acf7-39a2839cfb11/thanos-sidecar/0.log" Oct 01 14:41:23 crc kubenswrapper[4851]: I1001 14:41:23.033152 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3dc04e94-f66a-4937-86d2-24def7247794/setup-container/0.log" Oct 01 14:41:23 crc kubenswrapper[4851]: I1001 14:41:23.264377 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3dc04e94-f66a-4937-86d2-24def7247794/setup-container/0.log" Oct 01 14:41:23 crc kubenswrapper[4851]: I1001 14:41:23.282052 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3dc04e94-f66a-4937-86d2-24def7247794/rabbitmq/0.log" Oct 01 14:41:23 crc kubenswrapper[4851]: I1001 14:41:23.457275 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb/setup-container/0.log" Oct 01 14:41:23 crc kubenswrapper[4851]: I1001 14:41:23.665570 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb/rabbitmq/0.log" Oct 01 14:41:23 crc kubenswrapper[4851]: I1001 14:41:23.675604 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_8cae03b9-aeea-4fc7-90b0-7c0aad42f8fb/setup-container/0.log" Oct 01 14:41:23 crc kubenswrapper[4851]: I1001 14:41:23.830700 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_73f3b1c2-f1c0-47b8-bf31-4d2a185c852e/setup-container/0.log" Oct 01 14:41:24 crc kubenswrapper[4851]: I1001 14:41:24.035363 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_73f3b1c2-f1c0-47b8-bf31-4d2a185c852e/setup-container/0.log" Oct 01 14:41:24 crc kubenswrapper[4851]: I1001 14:41:24.096477 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_73f3b1c2-f1c0-47b8-bf31-4d2a185c852e/rabbitmq/0.log" Oct 01 14:41:24 crc kubenswrapper[4851]: I1001 14:41:24.219882 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-b4s6h_6946a3f7-315a-48df-8a02-af0fee0d1fce/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:41:24 crc kubenswrapper[4851]: I1001 14:41:24.332965 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-c2qg6_c97f93db-5638-4aa6-b20d-98b1602301af/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:41:24 crc kubenswrapper[4851]: I1001 14:41:24.533939 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-gsf47_9eaff7ea-f90a-4cfa-8850-cf308591ca11/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:41:24 crc kubenswrapper[4851]: I1001 14:41:24.629413 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-qntf8_75fea0b9-3dcb-4301-835b-346cfe0d09d7/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:41:24 crc kubenswrapper[4851]: I1001 14:41:24.825949 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-2cbdw_f4295079-cbbd-4349-b20b-ce68008d76e9/ssh-known-hosts-edpm-deployment/0.log" Oct 01 14:41:25 crc kubenswrapper[4851]: I1001 14:41:25.032709 4851 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_swift-proxy-b55c56cb7-m2hqv_39d289ff-07b0-479b-8f7a-bb1e5c108be2/proxy-server/0.log" Oct 01 14:41:25 crc kubenswrapper[4851]: I1001 14:41:25.248248 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ctv4l"] Oct 01 14:41:25 crc kubenswrapper[4851]: I1001 14:41:25.251726 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ctv4l" Oct 01 14:41:25 crc kubenswrapper[4851]: I1001 14:41:25.257395 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ctv4l"] Oct 01 14:41:25 crc kubenswrapper[4851]: I1001 14:41:25.282828 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-s5dg8_b0a8dd5b-b066-4203-b283-3ff979e8da98/swift-ring-rebalance/0.log" Oct 01 14:41:25 crc kubenswrapper[4851]: I1001 14:41:25.329658 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-b55c56cb7-m2hqv_39d289ff-07b0-479b-8f7a-bb1e5c108be2/proxy-httpd/0.log" Oct 01 14:41:25 crc kubenswrapper[4851]: I1001 14:41:25.415962 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2fws\" (UniqueName: \"kubernetes.io/projected/5526c286-f1f7-4e15-a94f-d064e9e4303d-kube-api-access-f2fws\") pod \"redhat-marketplace-ctv4l\" (UID: \"5526c286-f1f7-4e15-a94f-d064e9e4303d\") " pod="openshift-marketplace/redhat-marketplace-ctv4l" Oct 01 14:41:25 crc kubenswrapper[4851]: I1001 14:41:25.416120 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5526c286-f1f7-4e15-a94f-d064e9e4303d-catalog-content\") pod \"redhat-marketplace-ctv4l\" (UID: \"5526c286-f1f7-4e15-a94f-d064e9e4303d\") " pod="openshift-marketplace/redhat-marketplace-ctv4l" Oct 01 14:41:25 crc kubenswrapper[4851]: I1001 14:41:25.416198 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5526c286-f1f7-4e15-a94f-d064e9e4303d-utilities\") pod \"redhat-marketplace-ctv4l\" (UID: \"5526c286-f1f7-4e15-a94f-d064e9e4303d\") " pod="openshift-marketplace/redhat-marketplace-ctv4l" Oct 01 14:41:25 crc kubenswrapper[4851]: I1001 14:41:25.517293 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5526c286-f1f7-4e15-a94f-d064e9e4303d-catalog-content\") pod \"redhat-marketplace-ctv4l\" (UID: \"5526c286-f1f7-4e15-a94f-d064e9e4303d\") " pod="openshift-marketplace/redhat-marketplace-ctv4l" Oct 01 14:41:25 crc kubenswrapper[4851]: I1001 14:41:25.517694 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5526c286-f1f7-4e15-a94f-d064e9e4303d-utilities\") pod \"redhat-marketplace-ctv4l\" (UID: \"5526c286-f1f7-4e15-a94f-d064e9e4303d\") " pod="openshift-marketplace/redhat-marketplace-ctv4l" Oct 01 14:41:25 crc kubenswrapper[4851]: I1001 14:41:25.517821 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2fws\" (UniqueName: \"kubernetes.io/projected/5526c286-f1f7-4e15-a94f-d064e9e4303d-kube-api-access-f2fws\") pod \"redhat-marketplace-ctv4l\" (UID: \"5526c286-f1f7-4e15-a94f-d064e9e4303d\") " pod="openshift-marketplace/redhat-marketplace-ctv4l" Oct 01 14:41:25 crc 
kubenswrapper[4851]: I1001 14:41:25.518636 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5526c286-f1f7-4e15-a94f-d064e9e4303d-catalog-content\") pod \"redhat-marketplace-ctv4l\" (UID: \"5526c286-f1f7-4e15-a94f-d064e9e4303d\") " pod="openshift-marketplace/redhat-marketplace-ctv4l" Oct 01 14:41:25 crc kubenswrapper[4851]: I1001 14:41:25.518966 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5526c286-f1f7-4e15-a94f-d064e9e4303d-utilities\") pod \"redhat-marketplace-ctv4l\" (UID: \"5526c286-f1f7-4e15-a94f-d064e9e4303d\") " pod="openshift-marketplace/redhat-marketplace-ctv4l" Oct 01 14:41:25 crc kubenswrapper[4851]: I1001 14:41:25.552723 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2fws\" (UniqueName: \"kubernetes.io/projected/5526c286-f1f7-4e15-a94f-d064e9e4303d-kube-api-access-f2fws\") pod \"redhat-marketplace-ctv4l\" (UID: \"5526c286-f1f7-4e15-a94f-d064e9e4303d\") " pod="openshift-marketplace/redhat-marketplace-ctv4l" Oct 01 14:41:25 crc kubenswrapper[4851]: I1001 14:41:25.564021 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/account-auditor/0.log" Oct 01 14:41:25 crc kubenswrapper[4851]: I1001 14:41:25.574430 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ctv4l" Oct 01 14:41:25 crc kubenswrapper[4851]: I1001 14:41:25.623913 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/account-reaper/0.log" Oct 01 14:41:25 crc kubenswrapper[4851]: I1001 14:41:25.812863 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/account-replicator/0.log" Oct 01 14:41:25 crc kubenswrapper[4851]: I1001 14:41:25.902927 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/account-server/0.log" Oct 01 14:41:25 crc kubenswrapper[4851]: I1001 14:41:25.903304 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/container-auditor/0.log" Oct 01 14:41:26 crc kubenswrapper[4851]: I1001 14:41:26.065889 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ctv4l"] Oct 01 14:41:26 crc kubenswrapper[4851]: I1001 14:41:26.100607 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/container-replicator/0.log" Oct 01 14:41:26 crc kubenswrapper[4851]: I1001 14:41:26.203808 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/container-updater/0.log" Oct 01 14:41:26 crc kubenswrapper[4851]: I1001 14:41:26.223634 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/container-server/0.log" Oct 01 14:41:26 crc kubenswrapper[4851]: I1001 14:41:26.328114 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:41:26 crc kubenswrapper[4851]: E1001 14:41:26.328780 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:41:26 crc kubenswrapper[4851]: I1001 14:41:26.373031 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/object-auditor/0.log" Oct 01 14:41:26 crc kubenswrapper[4851]: I1001 14:41:26.447315 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/object-expirer/0.log" Oct 01 14:41:26 crc kubenswrapper[4851]: I1001 14:41:26.452913 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/object-replicator/0.log" Oct 01 14:41:26 crc kubenswrapper[4851]: I1001 14:41:26.559241 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/object-server/0.log" Oct 01 14:41:26 crc kubenswrapper[4851]: I1001 14:41:26.665066 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/object-updater/0.log" Oct 01 14:41:26 crc kubenswrapper[4851]: I1001 14:41:26.693743 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/rsync/0.log" Oct 01 14:41:26 crc kubenswrapper[4851]: I1001 14:41:26.751467 4851 generic.go:334] "Generic (PLEG): container finished" podID="5526c286-f1f7-4e15-a94f-d064e9e4303d" containerID="32498a1176b4236a138e7541703bae72ae0910107978799147d3aafe4d417d3d" exitCode=0 Oct 01 14:41:26 crc kubenswrapper[4851]: I1001 14:41:26.751751 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ctv4l" event={"ID":"5526c286-f1f7-4e15-a94f-d064e9e4303d","Type":"ContainerDied","Data":"32498a1176b4236a138e7541703bae72ae0910107978799147d3aafe4d417d3d"} Oct 01 14:41:26 crc kubenswrapper[4851]: I1001 14:41:26.751852 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ctv4l" event={"ID":"5526c286-f1f7-4e15-a94f-d064e9e4303d","Type":"ContainerStarted","Data":"694c3c867d562c49eed821348a6957a6f7b0b1ec51c4e2c7dbf538f77821ef4f"} Oct 01 14:41:26 crc kubenswrapper[4851]: I1001 14:41:26.756298 4851 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 14:41:26 crc kubenswrapper[4851]: I1001 14:41:26.787674 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6a4d7762-af01-4c7e-9641-4b2054a8885d/swift-recon-cron/0.log" Oct 01 14:41:27 crc kubenswrapper[4851]: I1001 14:41:27.000093 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-kznqk_bd7a7617-48e5-42a8-9630-ce17e87cde69/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:41:27 crc kubenswrapper[4851]: I1001 14:41:27.166307 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_996ff379-292e-4a71-a09b-164fc21abe76/tempest-tests-tempest-tests-runner/0.log" Oct 01 14:41:27 crc kubenswrapper[4851]: I1001 14:41:27.247769 4851 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_4b08465d-adad-4a72-b5bb-a50af717f8f6/test-operator-logs-container/0.log" Oct 01 14:41:27 crc kubenswrapper[4851]: I1001 14:41:27.441325 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mj2q6"] Oct 01 14:41:27 crc kubenswrapper[4851]: I1001 14:41:27.443445 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mj2q6" Oct 01 14:41:27 crc kubenswrapper[4851]: I1001 14:41:27.467550 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mj2q6"] Oct 01 14:41:27 crc kubenswrapper[4851]: I1001 14:41:27.508412 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-5dcx6_8457e33a-c243-4a18-80f6-8a1777d60054/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:41:27 crc kubenswrapper[4851]: I1001 14:41:27.561687 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ec2f53-130c-4bba-9d5a-20e873b75879-catalog-content\") pod \"community-operators-mj2q6\" (UID: \"c9ec2f53-130c-4bba-9d5a-20e873b75879\") " pod="openshift-marketplace/community-operators-mj2q6" Oct 01 14:41:27 crc kubenswrapper[4851]: I1001 14:41:27.561738 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ec2f53-130c-4bba-9d5a-20e873b75879-utilities\") pod \"community-operators-mj2q6\" (UID: \"c9ec2f53-130c-4bba-9d5a-20e873b75879\") " pod="openshift-marketplace/community-operators-mj2q6" Oct 01 14:41:27 crc kubenswrapper[4851]: I1001 14:41:27.561835 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zddg6\" (UniqueName: \"kubernetes.io/projected/c9ec2f53-130c-4bba-9d5a-20e873b75879-kube-api-access-zddg6\") pod \"community-operators-mj2q6\" (UID: \"c9ec2f53-130c-4bba-9d5a-20e873b75879\") " pod="openshift-marketplace/community-operators-mj2q6" Oct 01 14:41:27 crc kubenswrapper[4851]: I1001 14:41:27.668741 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zddg6\" (UniqueName: \"kubernetes.io/projected/c9ec2f53-130c-4bba-9d5a-20e873b75879-kube-api-access-zddg6\") pod \"community-operators-mj2q6\" (UID: \"c9ec2f53-130c-4bba-9d5a-20e873b75879\") " pod="openshift-marketplace/community-operators-mj2q6" Oct 01 14:41:27 crc kubenswrapper[4851]: I1001 14:41:27.670286 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ec2f53-130c-4bba-9d5a-20e873b75879-catalog-content\") pod \"community-operators-mj2q6\" (UID: \"c9ec2f53-130c-4bba-9d5a-20e873b75879\") " pod="openshift-marketplace/community-operators-mj2q6" Oct 01 14:41:27 crc kubenswrapper[4851]: I1001 14:41:27.670346 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ec2f53-130c-4bba-9d5a-20e873b75879-utilities\") pod \"community-operators-mj2q6\" (UID: \"c9ec2f53-130c-4bba-9d5a-20e873b75879\") " pod="openshift-marketplace/community-operators-mj2q6" Oct 01 14:41:27 crc kubenswrapper[4851]: I1001 14:41:27.671120 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ec2f53-130c-4bba-9d5a-20e873b75879-utilities\") pod \"community-operators-mj2q6\" (UID: \"c9ec2f53-130c-4bba-9d5a-20e873b75879\") " pod="openshift-marketplace/community-operators-mj2q6" Oct 01 14:41:27 crc kubenswrapper[4851]: I1001 14:41:27.671780 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ec2f53-130c-4bba-9d5a-20e873b75879-catalog-content\") pod \"community-operators-mj2q6\" (UID: \"c9ec2f53-130c-4bba-9d5a-20e873b75879\") " pod="openshift-marketplace/community-operators-mj2q6" Oct 01 14:41:27 crc kubenswrapper[4851]: I1001 14:41:27.698302 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zddg6\" (UniqueName: \"kubernetes.io/projected/c9ec2f53-130c-4bba-9d5a-20e873b75879-kube-api-access-zddg6\") pod \"community-operators-mj2q6\" (UID: \"c9ec2f53-130c-4bba-9d5a-20e873b75879\") " pod="openshift-marketplace/community-operators-mj2q6" Oct 01 14:41:27 crc kubenswrapper[4851]: I1001 14:41:27.776794 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mj2q6" Oct 01 14:41:27 crc kubenswrapper[4851]: I1001 14:41:27.918004 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_705d247f-3afa-49f5-ba1d-ab991af3e399/memcached/0.log" Oct 01 14:41:28 crc kubenswrapper[4851]: I1001 14:41:28.405463 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mj2q6"] Oct 01 14:41:28 crc kubenswrapper[4851]: I1001 14:41:28.774664 4851 generic.go:334] "Generic (PLEG): container finished" podID="c9ec2f53-130c-4bba-9d5a-20e873b75879" containerID="b74c49c9426e04e2820c74388f6904713b3f6d1bece22c638eb809d471c2dfab" exitCode=0 Oct 01 14:41:28 crc kubenswrapper[4851]: I1001 14:41:28.775222 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mj2q6" event={"ID":"c9ec2f53-130c-4bba-9d5a-20e873b75879","Type":"ContainerDied","Data":"b74c49c9426e04e2820c74388f6904713b3f6d1bece22c638eb809d471c2dfab"} Oct 01 14:41:28 crc kubenswrapper[4851]: I1001 14:41:28.775323 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mj2q6" event={"ID":"c9ec2f53-130c-4bba-9d5a-20e873b75879","Type":"ContainerStarted","Data":"18dbdfc06d85939f06dde3978633f79d41274b931bb3dc891e5a960a13021379"} Oct 01 14:41:28 crc kubenswrapper[4851]: I1001 14:41:28.791818 4851 generic.go:334] "Generic (PLEG): container finished" podID="5526c286-f1f7-4e15-a94f-d064e9e4303d" containerID="4d86574d01c454742c8d87cca83ec371e5057060c4454f5ffe1f020364a85257" exitCode=0 Oct 01 14:41:28 crc kubenswrapper[4851]: I1001 14:41:28.791859 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ctv4l" event={"ID":"5526c286-f1f7-4e15-a94f-d064e9e4303d","Type":"ContainerDied","Data":"4d86574d01c454742c8d87cca83ec371e5057060c4454f5ffe1f020364a85257"} Oct 01 14:41:28 crc kubenswrapper[4851]: I1001 14:41:28.953823 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_bb173f87-ff1f-4a9f-9de8-5073545e2697/watcher-api-log/0.log" Oct 01 14:41:29 crc kubenswrapper[4851]: I1001 14:41:29.059977 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_cad65438-644c-45b7-8267-370126fe6aef/watcher-applier/0.log" Oct 01 14:41:29 crc kubenswrapper[4851]: 
I1001 14:41:29.168763 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_971ca0ac-6de7-42f1-bf29-5174fd80ced4/watcher-decision-engine/3.log" Oct 01 14:41:29 crc kubenswrapper[4851]: I1001 14:41:29.809790 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ctv4l" event={"ID":"5526c286-f1f7-4e15-a94f-d064e9e4303d","Type":"ContainerStarted","Data":"a86aaa2991a889f8d9fb601d97fb4031969ebaed2cd70e9e9f2fee17b61c2b94"} Oct 01 14:41:29 crc kubenswrapper[4851]: I1001 14:41:29.813443 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mj2q6" event={"ID":"c9ec2f53-130c-4bba-9d5a-20e873b75879","Type":"ContainerStarted","Data":"b43ea5a3342130c3fae85f91d9e95135a70c0fabf4043c82dc2f55f2adfcfc89"} Oct 01 14:41:29 crc kubenswrapper[4851]: I1001 14:41:29.830970 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ctv4l" podStartSLOduration=2.332683008 podStartE2EDuration="4.8309498s" podCreationTimestamp="2025-10-01 14:41:25 +0000 UTC" firstStartedPulling="2025-10-01 14:41:26.75608953 +0000 UTC m=+6495.101207016" lastFinishedPulling="2025-10-01 14:41:29.254356322 +0000 UTC m=+6497.599473808" observedRunningTime="2025-10-01 14:41:29.825946497 +0000 UTC m=+6498.171063983" watchObservedRunningTime="2025-10-01 14:41:29.8309498 +0000 UTC m=+6498.176067306" Oct 01 14:41:30 crc kubenswrapper[4851]: I1001 14:41:30.824449 4851 generic.go:334] "Generic (PLEG): container finished" podID="c9ec2f53-130c-4bba-9d5a-20e873b75879" containerID="b43ea5a3342130c3fae85f91d9e95135a70c0fabf4043c82dc2f55f2adfcfc89" exitCode=0 Oct 01 14:41:30 crc kubenswrapper[4851]: I1001 14:41:30.824558 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mj2q6" event={"ID":"c9ec2f53-130c-4bba-9d5a-20e873b75879","Type":"ContainerDied","Data":"b43ea5a3342130c3fae85f91d9e95135a70c0fabf4043c82dc2f55f2adfcfc89"} Oct 01 14:41:31 crc kubenswrapper[4851]: I1001 14:41:31.849801 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mj2q6" event={"ID":"c9ec2f53-130c-4bba-9d5a-20e873b75879","Type":"ContainerStarted","Data":"b7876e2af514e52d4e75e83806b08b9e6753f06510c13a869214ca17c7301a37"} Oct 01 14:41:31 crc kubenswrapper[4851]: I1001 14:41:31.878089 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mj2q6" podStartSLOduration=2.299385675 podStartE2EDuration="4.87807179s" podCreationTimestamp="2025-10-01 14:41:27 +0000 UTC" firstStartedPulling="2025-10-01 14:41:28.782767227 +0000 UTC m=+6497.127884713" lastFinishedPulling="2025-10-01 14:41:31.361453342 +0000 UTC m=+6499.706570828" observedRunningTime="2025-10-01 14:41:31.870412391 +0000 UTC m=+6500.215529887" watchObservedRunningTime="2025-10-01 14:41:31.87807179 +0000 UTC m=+6500.223189276" Oct 01 14:41:32 crc kubenswrapper[4851]: I1001 14:41:32.020084 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_971ca0ac-6de7-42f1-bf29-5174fd80ced4/watcher-decision-engine/4.log" Oct 01 14:41:32 crc kubenswrapper[4851]: I1001 14:41:32.897027 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_bb173f87-ff1f-4a9f-9de8-5073545e2697/watcher-api/0.log" Oct 01 14:41:35 crc kubenswrapper[4851]: I1001 14:41:35.575317 4851 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ctv4l" Oct 01 14:41:35 crc kubenswrapper[4851]: I1001 14:41:35.576683 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ctv4l" Oct 01 14:41:35 crc kubenswrapper[4851]: I1001 14:41:35.630991 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ctv4l" Oct 01 14:41:35 crc kubenswrapper[4851]: I1001 14:41:35.938224 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ctv4l" Oct 01 14:41:36 crc kubenswrapper[4851]: I1001 14:41:36.033286 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ctv4l"] Oct 01 14:41:37 crc kubenswrapper[4851]: I1001 14:41:37.777781 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mj2q6" Oct 01 14:41:37 crc kubenswrapper[4851]: I1001 14:41:37.778061 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mj2q6" Oct 01 14:41:37 crc kubenswrapper[4851]: I1001 14:41:37.842326 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mj2q6" Oct 01 14:41:37 crc kubenswrapper[4851]: I1001 14:41:37.901621 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ctv4l" podUID="5526c286-f1f7-4e15-a94f-d064e9e4303d" containerName="registry-server" containerID="cri-o://a86aaa2991a889f8d9fb601d97fb4031969ebaed2cd70e9e9f2fee17b61c2b94" gracePeriod=2 Oct 01 14:41:37 crc kubenswrapper[4851]: I1001 14:41:37.955773 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mj2q6" Oct 01 14:41:38 crc kubenswrapper[4851]: I1001 14:41:38.329668 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:41:38 crc kubenswrapper[4851]: E1001 14:41:38.330135 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:41:38 crc kubenswrapper[4851]: I1001 14:41:38.432725 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mj2q6"] Oct 01 14:41:38 crc kubenswrapper[4851]: I1001 14:41:38.912297 4851 generic.go:334] "Generic (PLEG): container finished" podID="5526c286-f1f7-4e15-a94f-d064e9e4303d" containerID="a86aaa2991a889f8d9fb601d97fb4031969ebaed2cd70e9e9f2fee17b61c2b94" exitCode=0 Oct 01 14:41:38 crc kubenswrapper[4851]: I1001 14:41:38.912558 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ctv4l" event={"ID":"5526c286-f1f7-4e15-a94f-d064e9e4303d","Type":"ContainerDied","Data":"a86aaa2991a889f8d9fb601d97fb4031969ebaed2cd70e9e9f2fee17b61c2b94"} Oct 01 14:41:38 crc kubenswrapper[4851]: I1001 14:41:38.912661 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ctv4l" 
event={"ID":"5526c286-f1f7-4e15-a94f-d064e9e4303d","Type":"ContainerDied","Data":"694c3c867d562c49eed821348a6957a6f7b0b1ec51c4e2c7dbf538f77821ef4f"} Oct 01 14:41:38 crc kubenswrapper[4851]: I1001 14:41:38.912676 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="694c3c867d562c49eed821348a6957a6f7b0b1ec51c4e2c7dbf538f77821ef4f" Oct 01 14:41:38 crc kubenswrapper[4851]: I1001 14:41:38.914674 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ctv4l" Oct 01 14:41:39 crc kubenswrapper[4851]: I1001 14:41:39.017645 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5526c286-f1f7-4e15-a94f-d064e9e4303d-utilities\") pod \"5526c286-f1f7-4e15-a94f-d064e9e4303d\" (UID: \"5526c286-f1f7-4e15-a94f-d064e9e4303d\") " Oct 01 14:41:39 crc kubenswrapper[4851]: I1001 14:41:39.017843 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5526c286-f1f7-4e15-a94f-d064e9e4303d-catalog-content\") pod \"5526c286-f1f7-4e15-a94f-d064e9e4303d\" (UID: \"5526c286-f1f7-4e15-a94f-d064e9e4303d\") " Oct 01 14:41:39 crc kubenswrapper[4851]: I1001 14:41:39.017941 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2fws\" (UniqueName: \"kubernetes.io/projected/5526c286-f1f7-4e15-a94f-d064e9e4303d-kube-api-access-f2fws\") pod \"5526c286-f1f7-4e15-a94f-d064e9e4303d\" (UID: \"5526c286-f1f7-4e15-a94f-d064e9e4303d\") " Oct 01 14:41:39 crc kubenswrapper[4851]: I1001 14:41:39.018588 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5526c286-f1f7-4e15-a94f-d064e9e4303d-utilities" (OuterVolumeSpecName: "utilities") pod "5526c286-f1f7-4e15-a94f-d064e9e4303d" (UID: "5526c286-f1f7-4e15-a94f-d064e9e4303d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:41:39 crc kubenswrapper[4851]: I1001 14:41:39.027219 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5526c286-f1f7-4e15-a94f-d064e9e4303d-kube-api-access-f2fws" (OuterVolumeSpecName: "kube-api-access-f2fws") pod "5526c286-f1f7-4e15-a94f-d064e9e4303d" (UID: "5526c286-f1f7-4e15-a94f-d064e9e4303d"). InnerVolumeSpecName "kube-api-access-f2fws". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:41:39 crc kubenswrapper[4851]: I1001 14:41:39.034640 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5526c286-f1f7-4e15-a94f-d064e9e4303d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5526c286-f1f7-4e15-a94f-d064e9e4303d" (UID: "5526c286-f1f7-4e15-a94f-d064e9e4303d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:41:39 crc kubenswrapper[4851]: I1001 14:41:39.120696 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2fws\" (UniqueName: \"kubernetes.io/projected/5526c286-f1f7-4e15-a94f-d064e9e4303d-kube-api-access-f2fws\") on node \"crc\" DevicePath \"\"" Oct 01 14:41:39 crc kubenswrapper[4851]: I1001 14:41:39.120734 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5526c286-f1f7-4e15-a94f-d064e9e4303d-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:41:39 crc kubenswrapper[4851]: I1001 14:41:39.120743 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5526c286-f1f7-4e15-a94f-d064e9e4303d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:41:39 crc kubenswrapper[4851]: I1001 14:41:39.923050 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ctv4l" Oct 01 14:41:39 crc kubenswrapper[4851]: I1001 14:41:39.923132 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mj2q6" podUID="c9ec2f53-130c-4bba-9d5a-20e873b75879" containerName="registry-server" containerID="cri-o://b7876e2af514e52d4e75e83806b08b9e6753f06510c13a869214ca17c7301a37" gracePeriod=2 Oct 01 14:41:39 crc kubenswrapper[4851]: I1001 14:41:39.964072 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ctv4l"] Oct 01 14:41:39 crc kubenswrapper[4851]: I1001 14:41:39.975246 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ctv4l"] Oct 01 14:41:40 crc kubenswrapper[4851]: I1001 14:41:40.340618 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5526c286-f1f7-4e15-a94f-d064e9e4303d" path="/var/lib/kubelet/pods/5526c286-f1f7-4e15-a94f-d064e9e4303d/volumes" Oct 01 14:41:40 crc kubenswrapper[4851]: I1001 14:41:40.410156 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mj2q6" Oct 01 14:41:40 crc kubenswrapper[4851]: I1001 14:41:40.444840 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zddg6\" (UniqueName: \"kubernetes.io/projected/c9ec2f53-130c-4bba-9d5a-20e873b75879-kube-api-access-zddg6\") pod \"c9ec2f53-130c-4bba-9d5a-20e873b75879\" (UID: \"c9ec2f53-130c-4bba-9d5a-20e873b75879\") " Oct 01 14:41:40 crc kubenswrapper[4851]: I1001 14:41:40.445077 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ec2f53-130c-4bba-9d5a-20e873b75879-utilities\") pod \"c9ec2f53-130c-4bba-9d5a-20e873b75879\" (UID: \"c9ec2f53-130c-4bba-9d5a-20e873b75879\") " Oct 01 14:41:40 crc kubenswrapper[4851]: I1001 14:41:40.445114 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ec2f53-130c-4bba-9d5a-20e873b75879-catalog-content\") pod \"c9ec2f53-130c-4bba-9d5a-20e873b75879\" (UID: \"c9ec2f53-130c-4bba-9d5a-20e873b75879\") " Oct 01 14:41:40 crc kubenswrapper[4851]: I1001 14:41:40.445898 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9ec2f53-130c-4bba-9d5a-20e873b75879-utilities" (OuterVolumeSpecName: "utilities") pod "c9ec2f53-130c-4bba-9d5a-20e873b75879" (UID: "c9ec2f53-130c-4bba-9d5a-20e873b75879"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:41:40 crc kubenswrapper[4851]: I1001 14:41:40.446520 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ec2f53-130c-4bba-9d5a-20e873b75879-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:41:40 crc kubenswrapper[4851]: I1001 14:41:40.467706 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9ec2f53-130c-4bba-9d5a-20e873b75879-kube-api-access-zddg6" (OuterVolumeSpecName: "kube-api-access-zddg6") pod "c9ec2f53-130c-4bba-9d5a-20e873b75879" (UID: "c9ec2f53-130c-4bba-9d5a-20e873b75879"). InnerVolumeSpecName "kube-api-access-zddg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:41:40 crc kubenswrapper[4851]: I1001 14:41:40.547760 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zddg6\" (UniqueName: \"kubernetes.io/projected/c9ec2f53-130c-4bba-9d5a-20e873b75879-kube-api-access-zddg6\") on node \"crc\" DevicePath \"\"" Oct 01 14:41:40 crc kubenswrapper[4851]: I1001 14:41:40.577253 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9ec2f53-130c-4bba-9d5a-20e873b75879-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9ec2f53-130c-4bba-9d5a-20e873b75879" (UID: "c9ec2f53-130c-4bba-9d5a-20e873b75879"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:41:40 crc kubenswrapper[4851]: I1001 14:41:40.649271 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ec2f53-130c-4bba-9d5a-20e873b75879-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:41:40 crc kubenswrapper[4851]: I1001 14:41:40.936092 4851 generic.go:334] "Generic (PLEG): container finished" podID="c9ec2f53-130c-4bba-9d5a-20e873b75879" containerID="b7876e2af514e52d4e75e83806b08b9e6753f06510c13a869214ca17c7301a37" exitCode=0 Oct 01 14:41:40 crc kubenswrapper[4851]: I1001 14:41:40.936134 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mj2q6" event={"ID":"c9ec2f53-130c-4bba-9d5a-20e873b75879","Type":"ContainerDied","Data":"b7876e2af514e52d4e75e83806b08b9e6753f06510c13a869214ca17c7301a37"} Oct 01 14:41:40 crc kubenswrapper[4851]: I1001 14:41:40.936160 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mj2q6" event={"ID":"c9ec2f53-130c-4bba-9d5a-20e873b75879","Type":"ContainerDied","Data":"18dbdfc06d85939f06dde3978633f79d41274b931bb3dc891e5a960a13021379"} Oct 01 14:41:40 crc kubenswrapper[4851]: I1001 14:41:40.936176 4851 scope.go:117] "RemoveContainer" containerID="b7876e2af514e52d4e75e83806b08b9e6753f06510c13a869214ca17c7301a37" Oct 01 14:41:40 crc kubenswrapper[4851]: I1001 14:41:40.936323 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mj2q6" Oct 01 14:41:40 crc kubenswrapper[4851]: I1001 14:41:40.979015 4851 scope.go:117] "RemoveContainer" containerID="b43ea5a3342130c3fae85f91d9e95135a70c0fabf4043c82dc2f55f2adfcfc89" Oct 01 14:41:40 crc kubenswrapper[4851]: I1001 14:41:40.979972 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mj2q6"] Oct 01 14:41:40 crc kubenswrapper[4851]: I1001 14:41:40.992305 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mj2q6"] Oct 01 14:41:41 crc kubenswrapper[4851]: I1001 14:41:41.003560 4851 scope.go:117] "RemoveContainer" containerID="b74c49c9426e04e2820c74388f6904713b3f6d1bece22c638eb809d471c2dfab" Oct 01 14:41:41 crc kubenswrapper[4851]: I1001 14:41:41.053226 4851 scope.go:117] "RemoveContainer" containerID="b7876e2af514e52d4e75e83806b08b9e6753f06510c13a869214ca17c7301a37" Oct 01 14:41:41 crc kubenswrapper[4851]: E1001 14:41:41.053702 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7876e2af514e52d4e75e83806b08b9e6753f06510c13a869214ca17c7301a37\": container with ID starting with b7876e2af514e52d4e75e83806b08b9e6753f06510c13a869214ca17c7301a37 not found: ID does not exist" containerID="b7876e2af514e52d4e75e83806b08b9e6753f06510c13a869214ca17c7301a37" Oct 01 14:41:41 crc kubenswrapper[4851]: I1001 14:41:41.053734 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7876e2af514e52d4e75e83806b08b9e6753f06510c13a869214ca17c7301a37"} err="failed to get container status \"b7876e2af514e52d4e75e83806b08b9e6753f06510c13a869214ca17c7301a37\": rpc error: code = NotFound desc = could not find container \"b7876e2af514e52d4e75e83806b08b9e6753f06510c13a869214ca17c7301a37\": container with ID starting with b7876e2af514e52d4e75e83806b08b9e6753f06510c13a869214ca17c7301a37 not found: ID does not exist" Oct 01 
14:41:41 crc kubenswrapper[4851]: I1001 14:41:41.053753 4851 scope.go:117] "RemoveContainer" containerID="b43ea5a3342130c3fae85f91d9e95135a70c0fabf4043c82dc2f55f2adfcfc89" Oct 01 14:41:41 crc kubenswrapper[4851]: E1001 14:41:41.053981 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b43ea5a3342130c3fae85f91d9e95135a70c0fabf4043c82dc2f55f2adfcfc89\": container with ID starting with b43ea5a3342130c3fae85f91d9e95135a70c0fabf4043c82dc2f55f2adfcfc89 not found: ID does not exist" containerID="b43ea5a3342130c3fae85f91d9e95135a70c0fabf4043c82dc2f55f2adfcfc89" Oct 01 14:41:41 crc kubenswrapper[4851]: I1001 14:41:41.054001 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b43ea5a3342130c3fae85f91d9e95135a70c0fabf4043c82dc2f55f2adfcfc89"} err="failed to get container status \"b43ea5a3342130c3fae85f91d9e95135a70c0fabf4043c82dc2f55f2adfcfc89\": rpc error: code = NotFound desc = could not find container \"b43ea5a3342130c3fae85f91d9e95135a70c0fabf4043c82dc2f55f2adfcfc89\": container with ID starting with b43ea5a3342130c3fae85f91d9e95135a70c0fabf4043c82dc2f55f2adfcfc89 not found: ID does not exist" Oct 01 14:41:41 crc kubenswrapper[4851]: I1001 14:41:41.054013 4851 scope.go:117] "RemoveContainer" containerID="b74c49c9426e04e2820c74388f6904713b3f6d1bece22c638eb809d471c2dfab" Oct 01 14:41:41 crc kubenswrapper[4851]: E1001 14:41:41.054187 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b74c49c9426e04e2820c74388f6904713b3f6d1bece22c638eb809d471c2dfab\": container with ID starting with b74c49c9426e04e2820c74388f6904713b3f6d1bece22c638eb809d471c2dfab not found: ID does not exist" containerID="b74c49c9426e04e2820c74388f6904713b3f6d1bece22c638eb809d471c2dfab" Oct 01 14:41:41 crc kubenswrapper[4851]: I1001 14:41:41.054205 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b74c49c9426e04e2820c74388f6904713b3f6d1bece22c638eb809d471c2dfab"} err="failed to get container status \"b74c49c9426e04e2820c74388f6904713b3f6d1bece22c638eb809d471c2dfab\": rpc error: code = NotFound desc = could not find container \"b74c49c9426e04e2820c74388f6904713b3f6d1bece22c638eb809d471c2dfab\": container with ID starting with b74c49c9426e04e2820c74388f6904713b3f6d1bece22c638eb809d471c2dfab not found: ID does not exist" Oct 01 14:41:42 crc kubenswrapper[4851]: I1001 14:41:42.340058 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9ec2f53-130c-4bba-9d5a-20e873b75879" path="/var/lib/kubelet/pods/c9ec2f53-130c-4bba-9d5a-20e873b75879/volumes" Oct 01 14:41:49 crc kubenswrapper[4851]: I1001 14:41:49.328928 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:41:49 crc kubenswrapper[4851]: E1001 14:41:49.329726 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:42:00 crc kubenswrapper[4851]: I1001 14:42:00.145833 4851 generic.go:334] "Generic (PLEG): container finished" podID="eaa405d4-4e1b-4b65-a906-0bd554e274d0" 
containerID="e1e2eae2a32e596bdac7a5350fbc36c58f075916a902ed14d0adadafde9b1f2f" exitCode=0 Oct 01 14:42:00 crc kubenswrapper[4851]: I1001 14:42:00.145922 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-95hhg/crc-debug-sg7gg" event={"ID":"eaa405d4-4e1b-4b65-a906-0bd554e274d0","Type":"ContainerDied","Data":"e1e2eae2a32e596bdac7a5350fbc36c58f075916a902ed14d0adadafde9b1f2f"} Oct 01 14:42:01 crc kubenswrapper[4851]: I1001 14:42:01.256930 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-95hhg/crc-debug-sg7gg" Oct 01 14:42:01 crc kubenswrapper[4851]: I1001 14:42:01.280658 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxv2z\" (UniqueName: \"kubernetes.io/projected/eaa405d4-4e1b-4b65-a906-0bd554e274d0-kube-api-access-vxv2z\") pod \"eaa405d4-4e1b-4b65-a906-0bd554e274d0\" (UID: \"eaa405d4-4e1b-4b65-a906-0bd554e274d0\") " Oct 01 14:42:01 crc kubenswrapper[4851]: I1001 14:42:01.280852 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eaa405d4-4e1b-4b65-a906-0bd554e274d0-host\") pod \"eaa405d4-4e1b-4b65-a906-0bd554e274d0\" (UID: \"eaa405d4-4e1b-4b65-a906-0bd554e274d0\") " Oct 01 14:42:01 crc kubenswrapper[4851]: I1001 14:42:01.281009 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eaa405d4-4e1b-4b65-a906-0bd554e274d0-host" (OuterVolumeSpecName: "host") pod "eaa405d4-4e1b-4b65-a906-0bd554e274d0" (UID: "eaa405d4-4e1b-4b65-a906-0bd554e274d0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 14:42:01 crc kubenswrapper[4851]: I1001 14:42:01.281657 4851 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eaa405d4-4e1b-4b65-a906-0bd554e274d0-host\") on node \"crc\" DevicePath \"\"" Oct 01 14:42:01 crc kubenswrapper[4851]: I1001 14:42:01.287435 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaa405d4-4e1b-4b65-a906-0bd554e274d0-kube-api-access-vxv2z" (OuterVolumeSpecName: "kube-api-access-vxv2z") pod "eaa405d4-4e1b-4b65-a906-0bd554e274d0" (UID: "eaa405d4-4e1b-4b65-a906-0bd554e274d0"). InnerVolumeSpecName "kube-api-access-vxv2z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:42:01 crc kubenswrapper[4851]: I1001 14:42:01.289343 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-95hhg/crc-debug-sg7gg"] Oct 01 14:42:01 crc kubenswrapper[4851]: I1001 14:42:01.301968 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-95hhg/crc-debug-sg7gg"] Oct 01 14:42:01 crc kubenswrapper[4851]: I1001 14:42:01.328224 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:42:01 crc kubenswrapper[4851]: E1001 14:42:01.328661 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:42:01 crc kubenswrapper[4851]: I1001 14:42:01.383017 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxv2z\" (UniqueName: \"kubernetes.io/projected/eaa405d4-4e1b-4b65-a906-0bd554e274d0-kube-api-access-vxv2z\") on node \"crc\" DevicePath \"\"" Oct 01 14:42:02 crc kubenswrapper[4851]: I1001 14:42:02.166651 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17311fae5bea6d1bc367153a885c422f2cb3a5d22db296592fef967a4bea99db" Oct 01 14:42:02 crc kubenswrapper[4851]: I1001 14:42:02.166759 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-95hhg/crc-debug-sg7gg" Oct 01 14:42:02 crc kubenswrapper[4851]: I1001 14:42:02.346127 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaa405d4-4e1b-4b65-a906-0bd554e274d0" path="/var/lib/kubelet/pods/eaa405d4-4e1b-4b65-a906-0bd554e274d0/volumes" Oct 01 14:42:02 crc kubenswrapper[4851]: I1001 14:42:02.507563 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-95hhg/crc-debug-dbct7"] Oct 01 14:42:02 crc kubenswrapper[4851]: E1001 14:42:02.508102 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaa405d4-4e1b-4b65-a906-0bd554e274d0" containerName="container-00" Oct 01 14:42:02 crc kubenswrapper[4851]: I1001 14:42:02.508129 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaa405d4-4e1b-4b65-a906-0bd554e274d0" containerName="container-00" Oct 01 14:42:02 crc kubenswrapper[4851]: E1001 14:42:02.508143 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ec2f53-130c-4bba-9d5a-20e873b75879" containerName="extract-content" Oct 01 14:42:02 crc kubenswrapper[4851]: I1001 14:42:02.508152 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ec2f53-130c-4bba-9d5a-20e873b75879" containerName="extract-content" Oct 01 14:42:02 crc kubenswrapper[4851]: E1001 14:42:02.508273 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ec2f53-130c-4bba-9d5a-20e873b75879" containerName="extract-utilities" Oct 01 14:42:02 crc kubenswrapper[4851]: I1001 14:42:02.508288 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ec2f53-130c-4bba-9d5a-20e873b75879" containerName="extract-utilities" Oct 01 14:42:02 crc kubenswrapper[4851]: E1001 14:42:02.508362 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5526c286-f1f7-4e15-a94f-d064e9e4303d" 
containerName="registry-server" Oct 01 14:42:02 crc kubenswrapper[4851]: I1001 14:42:02.508374 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="5526c286-f1f7-4e15-a94f-d064e9e4303d" containerName="registry-server" Oct 01 14:42:02 crc kubenswrapper[4851]: E1001 14:42:02.508387 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ec2f53-130c-4bba-9d5a-20e873b75879" containerName="registry-server" Oct 01 14:42:02 crc kubenswrapper[4851]: I1001 14:42:02.508398 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ec2f53-130c-4bba-9d5a-20e873b75879" containerName="registry-server" Oct 01 14:42:02 crc kubenswrapper[4851]: E1001 14:42:02.508415 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5526c286-f1f7-4e15-a94f-d064e9e4303d" containerName="extract-content" Oct 01 14:42:02 crc kubenswrapper[4851]: I1001 14:42:02.508424 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="5526c286-f1f7-4e15-a94f-d064e9e4303d" containerName="extract-content" Oct 01 14:42:02 crc kubenswrapper[4851]: E1001 14:42:02.508460 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5526c286-f1f7-4e15-a94f-d064e9e4303d" containerName="extract-utilities" Oct 01 14:42:02 crc kubenswrapper[4851]: I1001 14:42:02.508470 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="5526c286-f1f7-4e15-a94f-d064e9e4303d" containerName="extract-utilities" Oct 01 14:42:02 crc kubenswrapper[4851]: I1001 14:42:02.508814 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaa405d4-4e1b-4b65-a906-0bd554e274d0" containerName="container-00" Oct 01 14:42:02 crc kubenswrapper[4851]: I1001 14:42:02.508841 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ec2f53-130c-4bba-9d5a-20e873b75879" containerName="registry-server" Oct 01 14:42:02 crc kubenswrapper[4851]: I1001 14:42:02.508862 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="5526c286-f1f7-4e15-a94f-d064e9e4303d" containerName="registry-server" Oct 01 14:42:02 crc kubenswrapper[4851]: I1001 14:42:02.509890 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-95hhg/crc-debug-dbct7" Oct 01 14:42:02 crc kubenswrapper[4851]: I1001 14:42:02.711326 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv8x2\" (UniqueName: \"kubernetes.io/projected/7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c-kube-api-access-kv8x2\") pod \"crc-debug-dbct7\" (UID: \"7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c\") " pod="openshift-must-gather-95hhg/crc-debug-dbct7" Oct 01 14:42:02 crc kubenswrapper[4851]: I1001 14:42:02.711396 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c-host\") pod \"crc-debug-dbct7\" (UID: \"7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c\") " pod="openshift-must-gather-95hhg/crc-debug-dbct7" Oct 01 14:42:02 crc kubenswrapper[4851]: I1001 14:42:02.814531 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv8x2\" (UniqueName: \"kubernetes.io/projected/7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c-kube-api-access-kv8x2\") pod \"crc-debug-dbct7\" (UID: \"7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c\") " pod="openshift-must-gather-95hhg/crc-debug-dbct7" Oct 01 14:42:02 crc kubenswrapper[4851]: I1001 14:42:02.814633 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c-host\") pod \"crc-debug-dbct7\" (UID: \"7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c\") " pod="openshift-must-gather-95hhg/crc-debug-dbct7" Oct 01 14:42:02 crc kubenswrapper[4851]: I1001 14:42:02.814902 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c-host\") pod \"crc-debug-dbct7\" (UID: \"7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c\") " pod="openshift-must-gather-95hhg/crc-debug-dbct7" Oct 01 14:42:02 crc kubenswrapper[4851]: I1001 14:42:02.846736 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv8x2\" (UniqueName: \"kubernetes.io/projected/7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c-kube-api-access-kv8x2\") pod \"crc-debug-dbct7\" (UID: \"7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c\") " pod="openshift-must-gather-95hhg/crc-debug-dbct7" Oct 01 14:42:03 crc kubenswrapper[4851]: I1001 14:42:03.126375 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-95hhg/crc-debug-dbct7" Oct 01 14:42:03 crc kubenswrapper[4851]: I1001 14:42:03.175792 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-95hhg/crc-debug-dbct7" event={"ID":"7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c","Type":"ContainerStarted","Data":"7f31c061318506633107a25fd6c098006d05b1327dadae40be7ede964063a941"} Oct 01 14:42:04 crc kubenswrapper[4851]: I1001 14:42:04.191760 4851 generic.go:334] "Generic (PLEG): container finished" podID="7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c" containerID="39d480a7a77fd64004bc1cb76498ddfa7d0f1388dc6d39944fc906de20423caf" exitCode=0 Oct 01 14:42:04 crc kubenswrapper[4851]: I1001 14:42:04.191821 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-95hhg/crc-debug-dbct7" event={"ID":"7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c","Type":"ContainerDied","Data":"39d480a7a77fd64004bc1cb76498ddfa7d0f1388dc6d39944fc906de20423caf"} Oct 01 14:42:05 crc kubenswrapper[4851]: I1001 14:42:05.310634 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-95hhg/crc-debug-dbct7" Oct 01 14:42:05 crc kubenswrapper[4851]: I1001 14:42:05.459727 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c-host\") pod \"7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c\" (UID: \"7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c\") " Oct 01 14:42:05 crc kubenswrapper[4851]: I1001 14:42:05.459833 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv8x2\" (UniqueName: \"kubernetes.io/projected/7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c-kube-api-access-kv8x2\") pod \"7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c\" (UID: \"7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c\") " Oct 01 14:42:05 crc kubenswrapper[4851]: I1001 14:42:05.459865 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c-host" (OuterVolumeSpecName: "host") pod "7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c" (UID: "7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 14:42:05 crc kubenswrapper[4851]: I1001 14:42:05.460605 4851 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c-host\") on node \"crc\" DevicePath \"\"" Oct 01 14:42:05 crc kubenswrapper[4851]: I1001 14:42:05.471559 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c-kube-api-access-kv8x2" (OuterVolumeSpecName: "kube-api-access-kv8x2") pod "7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c" (UID: "7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c"). InnerVolumeSpecName "kube-api-access-kv8x2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:42:05 crc kubenswrapper[4851]: I1001 14:42:05.561984 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv8x2\" (UniqueName: \"kubernetes.io/projected/7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c-kube-api-access-kv8x2\") on node \"crc\" DevicePath \"\"" Oct 01 14:42:06 crc kubenswrapper[4851]: I1001 14:42:06.212425 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-95hhg/crc-debug-dbct7" event={"ID":"7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c","Type":"ContainerDied","Data":"7f31c061318506633107a25fd6c098006d05b1327dadae40be7ede964063a941"} Oct 01 14:42:06 crc kubenswrapper[4851]: I1001 14:42:06.212757 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f31c061318506633107a25fd6c098006d05b1327dadae40be7ede964063a941" Oct 01 14:42:06 crc kubenswrapper[4851]: I1001 14:42:06.212711 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-95hhg/crc-debug-dbct7" Oct 01 14:42:13 crc kubenswrapper[4851]: I1001 14:42:13.296936 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-95hhg/crc-debug-dbct7"] Oct 01 14:42:13 crc kubenswrapper[4851]: I1001 14:42:13.305435 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-95hhg/crc-debug-dbct7"] Oct 01 14:42:14 crc kubenswrapper[4851]: I1001 14:42:14.345159 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c" path="/var/lib/kubelet/pods/7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c/volumes" Oct 01 14:42:14 crc kubenswrapper[4851]: I1001 14:42:14.688382 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-95hhg/crc-debug-g5z7p"] Oct 01 14:42:14 crc kubenswrapper[4851]: E1001 14:42:14.688839 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c" containerName="container-00" Oct 01 14:42:14 crc kubenswrapper[4851]: I1001 14:42:14.688862 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c" containerName="container-00" Oct 01 14:42:14 crc kubenswrapper[4851]: I1001 14:42:14.689094 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c15e6fd-dd7a-4a9e-aa15-3dde531b0d4c" containerName="container-00" Oct 01 14:42:14 crc kubenswrapper[4851]: I1001 14:42:14.689773 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-95hhg/crc-debug-g5z7p" Oct 01 14:42:14 crc kubenswrapper[4851]: I1001 14:42:14.810601 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f1d04041-4489-41b2-b2a9-b086daff19ce-host\") pod \"crc-debug-g5z7p\" (UID: \"f1d04041-4489-41b2-b2a9-b086daff19ce\") " pod="openshift-must-gather-95hhg/crc-debug-g5z7p" Oct 01 14:42:14 crc kubenswrapper[4851]: I1001 14:42:14.810932 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p624w\" (UniqueName: \"kubernetes.io/projected/f1d04041-4489-41b2-b2a9-b086daff19ce-kube-api-access-p624w\") pod \"crc-debug-g5z7p\" (UID: \"f1d04041-4489-41b2-b2a9-b086daff19ce\") " pod="openshift-must-gather-95hhg/crc-debug-g5z7p" Oct 01 14:42:14 crc kubenswrapper[4851]: I1001 14:42:14.913222 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p624w\" (UniqueName: \"kubernetes.io/projected/f1d04041-4489-41b2-b2a9-b086daff19ce-kube-api-access-p624w\") pod \"crc-debug-g5z7p\" (UID: \"f1d04041-4489-41b2-b2a9-b086daff19ce\") " pod="openshift-must-gather-95hhg/crc-debug-g5z7p" Oct 01 14:42:14 crc kubenswrapper[4851]: I1001 14:42:14.913399 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f1d04041-4489-41b2-b2a9-b086daff19ce-host\") pod \"crc-debug-g5z7p\" (UID: \"f1d04041-4489-41b2-b2a9-b086daff19ce\") " pod="openshift-must-gather-95hhg/crc-debug-g5z7p" Oct 01 14:42:14 crc kubenswrapper[4851]: I1001 14:42:14.913637 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f1d04041-4489-41b2-b2a9-b086daff19ce-host\") pod \"crc-debug-g5z7p\" (UID: \"f1d04041-4489-41b2-b2a9-b086daff19ce\") " pod="openshift-must-gather-95hhg/crc-debug-g5z7p" Oct 01 14:42:14 crc kubenswrapper[4851]: I1001 14:42:14.937712 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p624w\" (UniqueName: \"kubernetes.io/projected/f1d04041-4489-41b2-b2a9-b086daff19ce-kube-api-access-p624w\") pod \"crc-debug-g5z7p\" (UID: \"f1d04041-4489-41b2-b2a9-b086daff19ce\") " pod="openshift-must-gather-95hhg/crc-debug-g5z7p" Oct 01 14:42:15 crc kubenswrapper[4851]: I1001 14:42:15.010395 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-95hhg/crc-debug-g5z7p" Oct 01 14:42:15 crc kubenswrapper[4851]: I1001 14:42:15.321851 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-95hhg/crc-debug-g5z7p" event={"ID":"f1d04041-4489-41b2-b2a9-b086daff19ce","Type":"ContainerStarted","Data":"a392be935ace593344671d580885a9246a2df2fdf4e1fa1a15590b8c8c0c57e4"} Oct 01 14:42:16 crc kubenswrapper[4851]: I1001 14:42:16.328020 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:42:16 crc kubenswrapper[4851]: E1001 14:42:16.328665 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:42:16 crc kubenswrapper[4851]: I1001 14:42:16.334296 4851 generic.go:334] "Generic (PLEG): container finished" podID="f1d04041-4489-41b2-b2a9-b086daff19ce" containerID="6edd680a1154a4306c959e37cf0f2250099779453398a33bd417a91c307fa9cc" exitCode=0 Oct 01 14:42:16 crc kubenswrapper[4851]: I1001 14:42:16.341786 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-95hhg/crc-debug-g5z7p" event={"ID":"f1d04041-4489-41b2-b2a9-b086daff19ce","Type":"ContainerDied","Data":"6edd680a1154a4306c959e37cf0f2250099779453398a33bd417a91c307fa9cc"} Oct 01 14:42:16 crc kubenswrapper[4851]: I1001 14:42:16.372809 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-95hhg/crc-debug-g5z7p"] Oct 01 14:42:16 crc kubenswrapper[4851]: I1001 14:42:16.380950 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-95hhg/crc-debug-g5z7p"] Oct 01 14:42:17 crc kubenswrapper[4851]: I1001 14:42:17.467978 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-95hhg/crc-debug-g5z7p" Oct 01 14:42:17 crc kubenswrapper[4851]: I1001 14:42:17.562703 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p624w\" (UniqueName: \"kubernetes.io/projected/f1d04041-4489-41b2-b2a9-b086daff19ce-kube-api-access-p624w\") pod \"f1d04041-4489-41b2-b2a9-b086daff19ce\" (UID: \"f1d04041-4489-41b2-b2a9-b086daff19ce\") " Oct 01 14:42:17 crc kubenswrapper[4851]: I1001 14:42:17.562922 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f1d04041-4489-41b2-b2a9-b086daff19ce-host\") pod \"f1d04041-4489-41b2-b2a9-b086daff19ce\" (UID: \"f1d04041-4489-41b2-b2a9-b086daff19ce\") " Oct 01 14:42:17 crc kubenswrapper[4851]: I1001 14:42:17.563067 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1d04041-4489-41b2-b2a9-b086daff19ce-host" (OuterVolumeSpecName: "host") pod "f1d04041-4489-41b2-b2a9-b086daff19ce" (UID: "f1d04041-4489-41b2-b2a9-b086daff19ce"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 14:42:17 crc kubenswrapper[4851]: I1001 14:42:17.563321 4851 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f1d04041-4489-41b2-b2a9-b086daff19ce-host\") on node \"crc\" DevicePath \"\"" Oct 01 14:42:17 crc kubenswrapper[4851]: I1001 14:42:17.568730 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1d04041-4489-41b2-b2a9-b086daff19ce-kube-api-access-p624w" (OuterVolumeSpecName: "kube-api-access-p624w") pod "f1d04041-4489-41b2-b2a9-b086daff19ce" (UID: "f1d04041-4489-41b2-b2a9-b086daff19ce"). InnerVolumeSpecName "kube-api-access-p624w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:42:17 crc kubenswrapper[4851]: I1001 14:42:17.664750 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p624w\" (UniqueName: \"kubernetes.io/projected/f1d04041-4489-41b2-b2a9-b086daff19ce-kube-api-access-p624w\") on node \"crc\" DevicePath \"\"" Oct 01 14:42:17 crc kubenswrapper[4851]: I1001 14:42:17.896471 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc_b2793519-7741-4285-b90b-2210f2c7a421/util/0.log" Oct 01 14:42:18 crc kubenswrapper[4851]: I1001 14:42:18.298952 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc_b2793519-7741-4285-b90b-2210f2c7a421/util/0.log" Oct 01 14:42:18 crc kubenswrapper[4851]: I1001 14:42:18.313369 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc_b2793519-7741-4285-b90b-2210f2c7a421/pull/0.log" Oct 01 14:42:18 crc kubenswrapper[4851]: I1001 14:42:18.338268 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc_b2793519-7741-4285-b90b-2210f2c7a421/pull/0.log" Oct 01 14:42:18 crc kubenswrapper[4851]: I1001 14:42:18.339218 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1d04041-4489-41b2-b2a9-b086daff19ce" path="/var/lib/kubelet/pods/f1d04041-4489-41b2-b2a9-b086daff19ce/volumes" Oct 01 14:42:18 crc kubenswrapper[4851]: I1001 14:42:18.353694 4851 scope.go:117] "RemoveContainer" containerID="6edd680a1154a4306c959e37cf0f2250099779453398a33bd417a91c307fa9cc" Oct 01 14:42:18 crc kubenswrapper[4851]: I1001 14:42:18.353731 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-95hhg/crc-debug-g5z7p" Oct 01 14:42:18 crc kubenswrapper[4851]: I1001 14:42:18.451015 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc_b2793519-7741-4285-b90b-2210f2c7a421/util/0.log" Oct 01 14:42:18 crc kubenswrapper[4851]: I1001 14:42:18.539379 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc_b2793519-7741-4285-b90b-2210f2c7a421/pull/0.log" Oct 01 14:42:18 crc kubenswrapper[4851]: I1001 14:42:18.552404 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d66f115c97aa440e40c67023e371297e16a8965bcd1bacaf8006a59cc46xbc_b2793519-7741-4285-b90b-2210f2c7a421/extract/0.log" Oct 01 14:42:18 crc kubenswrapper[4851]: I1001 14:42:18.634839 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-nxjn7_613abfd6-27d5-4f52-bad5-024d71335465/kube-rbac-proxy/0.log" Oct 01 14:42:18 crc kubenswrapper[4851]: I1001 14:42:18.742334 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-ms6rc_08cf0ffd-5ec4-4406-a912-319a1c9ced15/kube-rbac-proxy/0.log" Oct 01 14:42:18 crc kubenswrapper[4851]: I1001 14:42:18.790885 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-nxjn7_613abfd6-27d5-4f52-bad5-024d71335465/manager/0.log" Oct 01 14:42:18 crc kubenswrapper[4851]: I1001 14:42:18.860786 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-ms6rc_08cf0ffd-5ec4-4406-a912-319a1c9ced15/manager/0.log" Oct 01 14:42:18 crc kubenswrapper[4851]: I1001 14:42:18.954643 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-knmsq_4dcb48dd-5baf-415e-861e-ebfa40fc2e84/kube-rbac-proxy/0.log" Oct 01 14:42:19 crc kubenswrapper[4851]: I1001 14:42:19.000977 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-knmsq_4dcb48dd-5baf-415e-861e-ebfa40fc2e84/manager/0.log" Oct 01 14:42:19 crc kubenswrapper[4851]: I1001 14:42:19.150897 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-8c9zh_a1e24a95-87b4-4f06-b651-9c8c26a7021d/kube-rbac-proxy/0.log" Oct 01 14:42:19 crc kubenswrapper[4851]: I1001 14:42:19.241344 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-8c9zh_a1e24a95-87b4-4f06-b651-9c8c26a7021d/manager/0.log" Oct 01 14:42:19 crc kubenswrapper[4851]: I1001 14:42:19.294008 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-r7gqw_ba805aa7-4c6e-4dc1-8de8-c935ab1c2128/kube-rbac-proxy/0.log" Oct 01 14:42:19 crc kubenswrapper[4851]: I1001 14:42:19.362313 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-r7gqw_ba805aa7-4c6e-4dc1-8de8-c935ab1c2128/manager/0.log" Oct 01 14:42:19 crc kubenswrapper[4851]: I1001 14:42:19.431623 4851 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-tssnp_a01e01ff-d00e-482d-8901-72c0705672f1/kube-rbac-proxy/0.log" Oct 01 14:42:19 crc kubenswrapper[4851]: I1001 14:42:19.515154 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-tssnp_a01e01ff-d00e-482d-8901-72c0705672f1/manager/0.log" Oct 01 14:42:19 crc kubenswrapper[4851]: I1001 14:42:19.638178 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-4chzf_63075165-70d9-4dd4-9b52-41e59e59fcee/kube-rbac-proxy/0.log" Oct 01 14:42:19 crc kubenswrapper[4851]: I1001 14:42:19.831809 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-dq52c_ac9e1ccb-a68d-446e-b47b-de00d828f332/kube-rbac-proxy/0.log" Oct 01 14:42:19 crc kubenswrapper[4851]: I1001 14:42:19.839179 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-dq52c_ac9e1ccb-a68d-446e-b47b-de00d828f332/manager/0.log" Oct 01 14:42:19 crc kubenswrapper[4851]: I1001 14:42:19.869917 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-4chzf_63075165-70d9-4dd4-9b52-41e59e59fcee/manager/0.log" Oct 01 14:42:20 crc kubenswrapper[4851]: I1001 14:42:20.071460 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-flp6z_a7cf8be8-c21f-4904-838e-f185857ef960/manager/0.log" Oct 01 14:42:20 crc kubenswrapper[4851]: I1001 14:42:20.094213 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-flp6z_a7cf8be8-c21f-4904-838e-f185857ef960/kube-rbac-proxy/0.log" Oct 01 14:42:20 crc kubenswrapper[4851]: I1001 14:42:20.139220 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-zx88h_324c39a8-85b5-4caf-a719-a5f47a827d08/kube-rbac-proxy/0.log" Oct 01 14:42:20 crc kubenswrapper[4851]: I1001 14:42:20.261683 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-zx88h_324c39a8-85b5-4caf-a719-a5f47a827d08/manager/0.log" Oct 01 14:42:20 crc kubenswrapper[4851]: I1001 14:42:20.328819 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-qjqpv_de177c71-0c78-416b-a62f-2d73d86a2b70/kube-rbac-proxy/0.log" Oct 01 14:42:20 crc kubenswrapper[4851]: I1001 14:42:20.339332 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-qjqpv_de177c71-0c78-416b-a62f-2d73d86a2b70/manager/0.log" Oct 01 14:42:20 crc kubenswrapper[4851]: I1001 14:42:20.504228 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-22n8s_5d973d57-3aa0-4d14-9c4a-435f6ff880af/kube-rbac-proxy/0.log" Oct 01 14:42:20 crc kubenswrapper[4851]: I1001 14:42:20.562478 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-22n8s_5d973d57-3aa0-4d14-9c4a-435f6ff880af/manager/0.log" Oct 01 14:42:20 crc kubenswrapper[4851]: I1001 14:42:20.683854 4851 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-mkxn2_7abbaca1-f067-43b8-a24e-0219ce7e7eaa/kube-rbac-proxy/0.log" Oct 01 14:42:20 crc kubenswrapper[4851]: I1001 14:42:20.781584 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-qxlj6_ee6067ef-427a-49d8-99b6-694930e44a0d/kube-rbac-proxy/0.log" Oct 01 14:42:20 crc kubenswrapper[4851]: I1001 14:42:20.784701 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-mkxn2_7abbaca1-f067-43b8-a24e-0219ce7e7eaa/manager/0.log" Oct 01 14:42:20 crc kubenswrapper[4851]: I1001 14:42:20.923656 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-qxlj6_ee6067ef-427a-49d8-99b6-694930e44a0d/manager/0.log" Oct 01 14:42:20 crc kubenswrapper[4851]: I1001 14:42:20.999239 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8ctmstz_9ac4cdbb-6f06-4525-8f5a-61f81a230708/manager/0.log" Oct 01 14:42:21 crc kubenswrapper[4851]: I1001 14:42:21.004946 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8ctmstz_9ac4cdbb-6f06-4525-8f5a-61f81a230708/kube-rbac-proxy/0.log" Oct 01 14:42:21 crc kubenswrapper[4851]: I1001 14:42:21.160302 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5cc886c7f9-x6rkr_e6e332ad-4dcd-4526-abf5-c79bfce4ee72/kube-rbac-proxy/0.log" Oct 01 14:42:21 crc kubenswrapper[4851]: I1001 14:42:21.288386 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-548cc7d4f7-ll79v_fdc19a2b-5602-40b0-a49d-22d56b9724d7/kube-rbac-proxy/0.log" Oct 01 14:42:21 crc kubenswrapper[4851]: I1001 14:42:21.485830 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-d96ms_8a61efd7-e35e-49f6-880c-e2d18b49d157/registry-server/0.log" Oct 01 14:42:21 crc kubenswrapper[4851]: I1001 14:42:21.489650 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-548cc7d4f7-ll79v_fdc19a2b-5602-40b0-a49d-22d56b9724d7/operator/0.log" Oct 01 14:42:21 crc kubenswrapper[4851]: I1001 14:42:21.647352 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-c8r2z_f3be6466-a46d-49b7-a5e4-9465c82ce165/kube-rbac-proxy/0.log" Oct 01 14:42:21 crc kubenswrapper[4851]: I1001 14:42:21.796765 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-cslt7_d7ddf969-982c-4436-85e4-fb963d57a385/kube-rbac-proxy/0.log" Oct 01 14:42:21 crc kubenswrapper[4851]: I1001 14:42:21.817485 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-c8r2z_f3be6466-a46d-49b7-a5e4-9465c82ce165/manager/0.log" Oct 01 14:42:21 crc kubenswrapper[4851]: I1001 14:42:21.915991 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-cslt7_d7ddf969-982c-4436-85e4-fb963d57a385/manager/0.log" Oct 01 14:42:22 crc kubenswrapper[4851]: I1001 14:42:22.025368 
4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-26s54_d08d3102-97ef-4224-8a21-fa66c86a2f11/operator/0.log" Oct 01 14:42:22 crc kubenswrapper[4851]: I1001 14:42:22.165140 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-62xmc_9389dca9-fd43-4f83-a3a8-b755859b252e/kube-rbac-proxy/0.log" Oct 01 14:42:22 crc kubenswrapper[4851]: I1001 14:42:22.258907 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-62xmc_9389dca9-fd43-4f83-a3a8-b755859b252e/manager/0.log" Oct 01 14:42:22 crc kubenswrapper[4851]: I1001 14:42:22.396756 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5cc886c7f9-x6rkr_e6e332ad-4dcd-4526-abf5-c79bfce4ee72/manager/0.log" Oct 01 14:42:22 crc kubenswrapper[4851]: I1001 14:42:22.457775 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-zzts5_c9b7da57-564a-4053-a476-08db0d87c317/kube-rbac-proxy/0.log" Oct 01 14:42:22 crc kubenswrapper[4851]: I1001 14:42:22.634931 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-62wb7_5f0e4ca8-a97f-4a47-9ead-3e2c33881ef0/kube-rbac-proxy/0.log" Oct 01 14:42:22 crc kubenswrapper[4851]: I1001 14:42:22.712687 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-zzts5_c9b7da57-564a-4053-a476-08db0d87c317/manager/0.log" Oct 01 14:42:22 crc kubenswrapper[4851]: I1001 14:42:22.720828 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-62wb7_5f0e4ca8-a97f-4a47-9ead-3e2c33881ef0/manager/0.log" Oct 01 14:42:22 crc kubenswrapper[4851]: I1001 14:42:22.840163 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-d64f8f9f6-2qx7f_60b302ff-f344-4556-92dc-e8f7954b80c9/kube-rbac-proxy/0.log" Oct 01 14:42:22 crc kubenswrapper[4851]: I1001 14:42:22.890416 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-d64f8f9f6-2qx7f_60b302ff-f344-4556-92dc-e8f7954b80c9/manager/0.log" Oct 01 14:42:29 crc kubenswrapper[4851]: I1001 14:42:29.328704 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:42:29 crc kubenswrapper[4851]: E1001 14:42:29.329593 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:42:38 crc kubenswrapper[4851]: I1001 14:42:38.663857 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-bhp8c_c7ff570b-9b52-4d2f-b030-40eca1804794/control-plane-machine-set-operator/0.log" Oct 01 14:42:38 crc kubenswrapper[4851]: I1001 14:42:38.856530 4851 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-68zzg_0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697/kube-rbac-proxy/0.log" Oct 01 14:42:38 crc kubenswrapper[4851]: I1001 14:42:38.902567 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-68zzg_0ed3ddbf-1afc-4704-9ad0-29ae4e3a6697/machine-api-operator/0.log" Oct 01 14:42:44 crc kubenswrapper[4851]: I1001 14:42:44.329214 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:42:44 crc kubenswrapper[4851]: E1001 14:42:44.330092 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:42:50 crc kubenswrapper[4851]: I1001 14:42:50.827212 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-26dms_435110f0-cfd8-423b-afdc-a9cfe0426a3a/cert-manager-controller/0.log" Oct 01 14:42:51 crc kubenswrapper[4851]: I1001 14:42:51.013910 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-dhkqx_097254a7-5e3a-4d5b-90d5-f6f1ae0b4c3d/cert-manager-webhook/0.log" Oct 01 14:42:51 crc kubenswrapper[4851]: I1001 14:42:51.014122 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-d6clp_f24da60a-9cbf-482f-a764-43145245be50/cert-manager-cainjector/0.log" Oct 01 14:42:58 crc kubenswrapper[4851]: I1001 14:42:58.328839 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:42:58 crc kubenswrapper[4851]: E1001 14:42:58.329830 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:43:02 crc kubenswrapper[4851]: I1001 14:43:02.420637 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-9kpmg_874ffc52-8446-4804-baa4-b75d2348af0d/nmstate-console-plugin/0.log" Oct 01 14:43:02 crc kubenswrapper[4851]: I1001 14:43:02.560323 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-z4ptw_fccb3980-5aa4-4221-8f25-c14c68913a81/nmstate-handler/0.log" Oct 01 14:43:02 crc kubenswrapper[4851]: I1001 14:43:02.625643 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-2bcp8_94133d01-7539-4ee5-9151-62f52ec7a1e8/nmstate-metrics/0.log" Oct 01 14:43:02 crc kubenswrapper[4851]: I1001 14:43:02.629425 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-2bcp8_94133d01-7539-4ee5-9151-62f52ec7a1e8/kube-rbac-proxy/0.log" Oct 01 14:43:02 crc kubenswrapper[4851]: I1001 14:43:02.778276 4851 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-p2vrr_92d2d84c-4ae5-4943-9eeb-f88f900ec658/nmstate-operator/0.log" Oct 01 14:43:02 crc kubenswrapper[4851]: I1001 14:43:02.804437 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-zpvld_f4752bf8-2355-4922-92d2-5546fd2c4340/nmstate-webhook/0.log" Oct 01 14:43:12 crc kubenswrapper[4851]: I1001 14:43:12.345945 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:43:12 crc kubenswrapper[4851]: E1001 14:43:12.346813 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:43:16 crc kubenswrapper[4851]: I1001 14:43:16.634183 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-wrnrn_8516a5e5-0b45-41d2-baa6-2680bc58eb9b/kube-rbac-proxy/0.log" Oct 01 14:43:16 crc kubenswrapper[4851]: I1001 14:43:16.794531 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-wrnrn_8516a5e5-0b45-41d2-baa6-2680bc58eb9b/controller/0.log" Oct 01 14:43:16 crc kubenswrapper[4851]: I1001 14:43:16.833506 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-frr-files/0.log" Oct 01 14:43:16 crc kubenswrapper[4851]: I1001 14:43:16.946214 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-metrics/0.log" Oct 01 14:43:16 crc kubenswrapper[4851]: I1001 14:43:16.971288 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-frr-files/0.log" Oct 01 14:43:16 crc kubenswrapper[4851]: I1001 14:43:16.979705 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-reloader/0.log" Oct 01 14:43:17 crc kubenswrapper[4851]: I1001 14:43:17.001430 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-reloader/0.log" Oct 01 14:43:17 crc kubenswrapper[4851]: I1001 14:43:17.159139 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-frr-files/0.log" Oct 01 14:43:17 crc kubenswrapper[4851]: I1001 14:43:17.191385 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-metrics/0.log" Oct 01 14:43:17 crc kubenswrapper[4851]: I1001 14:43:17.198092 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-reloader/0.log" Oct 01 14:43:17 crc kubenswrapper[4851]: I1001 14:43:17.249370 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-metrics/0.log" Oct 01 14:43:17 crc kubenswrapper[4851]: I1001 14:43:17.355100 4851 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-frr-files/0.log" Oct 01 14:43:17 crc kubenswrapper[4851]: I1001 14:43:17.409555 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-metrics/0.log" Oct 01 14:43:17 crc kubenswrapper[4851]: I1001 14:43:17.409959 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/cp-reloader/0.log" Oct 01 14:43:17 crc kubenswrapper[4851]: I1001 14:43:17.442267 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/controller/0.log" Oct 01 14:43:17 crc kubenswrapper[4851]: I1001 14:43:17.612461 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/frr-metrics/0.log" Oct 01 14:43:17 crc kubenswrapper[4851]: I1001 14:43:17.628452 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/kube-rbac-proxy/0.log" Oct 01 14:43:17 crc kubenswrapper[4851]: I1001 14:43:17.653966 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/kube-rbac-proxy-frr/0.log" Oct 01 14:43:17 crc kubenswrapper[4851]: I1001 14:43:17.815995 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/reloader/0.log" Oct 01 14:43:17 crc kubenswrapper[4851]: I1001 14:43:17.867313 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-wqlkq_35ad8647-b8e1-4935-a669-ae418db665b7/frr-k8s-webhook-server/0.log" Oct 01 14:43:18 crc kubenswrapper[4851]: I1001 14:43:18.097617 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-bc7c7cbf4-vznd6_f4a3620c-32ab-4a6b-9a3b-0ca2e11868e9/manager/0.log" Oct 01 14:43:18 crc kubenswrapper[4851]: I1001 14:43:18.193149 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-57f9d579c4-n58rf_f8382d60-6b68-4f9a-8aa0-8fefd3c2de4c/webhook-server/0.log" Oct 01 14:43:18 crc kubenswrapper[4851]: I1001 14:43:18.312108 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bgphr_d8ce22ba-fa18-4924-9c48-360cd16f0857/kube-rbac-proxy/0.log" Oct 01 14:43:18 crc kubenswrapper[4851]: I1001 14:43:18.953683 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bgphr_d8ce22ba-fa18-4924-9c48-360cd16f0857/speaker/0.log" Oct 01 14:43:19 crc kubenswrapper[4851]: I1001 14:43:19.269591 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jcrr_e3ef3f0b-665a-491b-857a-5ee2c5614f90/frr/0.log" Oct 01 14:43:26 crc kubenswrapper[4851]: I1001 14:43:26.328114 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:43:26 crc kubenswrapper[4851]: E1001 14:43:26.328669 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:43:31 crc kubenswrapper[4851]: I1001 14:43:31.067514 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8_127a1393-40c1-4ee6-84a6-9ccd6dee9595/util/0.log" Oct 01 14:43:31 crc kubenswrapper[4851]: I1001 14:43:31.263579 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8_127a1393-40c1-4ee6-84a6-9ccd6dee9595/util/0.log" Oct 01 14:43:31 crc kubenswrapper[4851]: I1001 14:43:31.289071 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8_127a1393-40c1-4ee6-84a6-9ccd6dee9595/pull/0.log" Oct 01 14:43:31 crc kubenswrapper[4851]: I1001 14:43:31.315487 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8_127a1393-40c1-4ee6-84a6-9ccd6dee9595/pull/0.log" Oct 01 14:43:31 crc kubenswrapper[4851]: I1001 14:43:31.487290 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8_127a1393-40c1-4ee6-84a6-9ccd6dee9595/util/0.log" Oct 01 14:43:31 crc kubenswrapper[4851]: I1001 14:43:31.492492 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8_127a1393-40c1-4ee6-84a6-9ccd6dee9595/pull/0.log" Oct 01 14:43:31 crc kubenswrapper[4851]: I1001 14:43:31.509001 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcv82n8_127a1393-40c1-4ee6-84a6-9ccd6dee9595/extract/0.log" Oct 01 14:43:31 crc kubenswrapper[4851]: I1001 14:43:31.639864 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_d3aab3b2-075f-4afa-969e-3e32803601b9/util/0.log" Oct 01 14:43:31 crc kubenswrapper[4851]: I1001 14:43:31.816773 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_d3aab3b2-075f-4afa-969e-3e32803601b9/pull/0.log" Oct 01 14:43:31 crc kubenswrapper[4851]: I1001 14:43:31.832870 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_d3aab3b2-075f-4afa-969e-3e32803601b9/pull/0.log" Oct 01 14:43:31 crc kubenswrapper[4851]: I1001 14:43:31.842875 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_d3aab3b2-075f-4afa-969e-3e32803601b9/util/0.log" Oct 01 14:43:32 crc kubenswrapper[4851]: I1001 14:43:32.015124 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_d3aab3b2-075f-4afa-969e-3e32803601b9/util/0.log" Oct 01 14:43:32 crc kubenswrapper[4851]: I1001 14:43:32.051341 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_d3aab3b2-075f-4afa-969e-3e32803601b9/extract/0.log" Oct 01 14:43:32 crc 
kubenswrapper[4851]: I1001 14:43:32.073250 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d42fs9_d3aab3b2-075f-4afa-969e-3e32803601b9/pull/0.log" Oct 01 14:43:32 crc kubenswrapper[4851]: I1001 14:43:32.191273 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9mgxt_70558de8-d877-4f24-a9ee-18f3696799a8/extract-utilities/0.log" Oct 01 14:43:32 crc kubenswrapper[4851]: I1001 14:43:32.371220 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9mgxt_70558de8-d877-4f24-a9ee-18f3696799a8/extract-utilities/0.log" Oct 01 14:43:32 crc kubenswrapper[4851]: I1001 14:43:32.385868 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9mgxt_70558de8-d877-4f24-a9ee-18f3696799a8/extract-content/0.log" Oct 01 14:43:32 crc kubenswrapper[4851]: I1001 14:43:32.419993 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9mgxt_70558de8-d877-4f24-a9ee-18f3696799a8/extract-content/0.log" Oct 01 14:43:32 crc kubenswrapper[4851]: I1001 14:43:32.577038 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9mgxt_70558de8-d877-4f24-a9ee-18f3696799a8/extract-content/0.log" Oct 01 14:43:32 crc kubenswrapper[4851]: I1001 14:43:32.616971 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9mgxt_70558de8-d877-4f24-a9ee-18f3696799a8/extract-utilities/0.log" Oct 01 14:43:32 crc kubenswrapper[4851]: I1001 14:43:32.801621 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jgwl2_60fbd453-e33b-463e-8dc0-e82251bdec0d/extract-utilities/0.log" Oct 01 14:43:33 crc kubenswrapper[4851]: I1001 14:43:33.250264 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jgwl2_60fbd453-e33b-463e-8dc0-e82251bdec0d/extract-content/0.log" Oct 01 14:43:33 crc kubenswrapper[4851]: I1001 14:43:33.250460 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jgwl2_60fbd453-e33b-463e-8dc0-e82251bdec0d/extract-utilities/0.log" Oct 01 14:43:33 crc kubenswrapper[4851]: I1001 14:43:33.254201 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9mgxt_70558de8-d877-4f24-a9ee-18f3696799a8/registry-server/0.log" Oct 01 14:43:33 crc kubenswrapper[4851]: I1001 14:43:33.305571 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jgwl2_60fbd453-e33b-463e-8dc0-e82251bdec0d/extract-content/0.log" Oct 01 14:43:33 crc kubenswrapper[4851]: I1001 14:43:33.456531 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jgwl2_60fbd453-e33b-463e-8dc0-e82251bdec0d/extract-content/0.log" Oct 01 14:43:33 crc kubenswrapper[4851]: I1001 14:43:33.551522 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jgwl2_60fbd453-e33b-463e-8dc0-e82251bdec0d/extract-utilities/0.log" Oct 01 14:43:33 crc kubenswrapper[4851]: I1001 14:43:33.786888 4851 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf_ddb9987b-6077-405c-bcc2-15d713fb434d/util/0.log" Oct 01 14:43:34 crc kubenswrapper[4851]: I1001 14:43:34.030829 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf_ddb9987b-6077-405c-bcc2-15d713fb434d/util/0.log" Oct 01 14:43:34 crc kubenswrapper[4851]: I1001 14:43:34.032026 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf_ddb9987b-6077-405c-bcc2-15d713fb434d/pull/0.log" Oct 01 14:43:34 crc kubenswrapper[4851]: I1001 14:43:34.058685 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf_ddb9987b-6077-405c-bcc2-15d713fb434d/pull/0.log" Oct 01 14:43:34 crc kubenswrapper[4851]: I1001 14:43:34.278701 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf_ddb9987b-6077-405c-bcc2-15d713fb434d/extract/0.log" Oct 01 14:43:34 crc kubenswrapper[4851]: I1001 14:43:34.312996 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf_ddb9987b-6077-405c-bcc2-15d713fb434d/pull/0.log" Oct 01 14:43:34 crc kubenswrapper[4851]: I1001 14:43:34.323843 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rb6gf_ddb9987b-6077-405c-bcc2-15d713fb434d/util/0.log" Oct 01 14:43:34 crc kubenswrapper[4851]: I1001 14:43:34.567374 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jgwl2_60fbd453-e33b-463e-8dc0-e82251bdec0d/registry-server/0.log" Oct 01 14:43:34 crc kubenswrapper[4851]: I1001 14:43:34.572968 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xtrc2_0e26c58b-4c8a-4b07-96d9-7dea3f32ed8d/marketplace-operator/0.log" Oct 01 14:43:34 crc kubenswrapper[4851]: I1001 14:43:34.722416 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-knk85_b535b20d-942b-417e-8580-f29cebe2401f/extract-utilities/0.log" Oct 01 14:43:34 crc kubenswrapper[4851]: I1001 14:43:34.905945 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-knk85_b535b20d-942b-417e-8580-f29cebe2401f/extract-content/0.log" Oct 01 14:43:34 crc kubenswrapper[4851]: I1001 14:43:34.919258 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-knk85_b535b20d-942b-417e-8580-f29cebe2401f/extract-content/0.log" Oct 01 14:43:34 crc kubenswrapper[4851]: I1001 14:43:34.920576 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-knk85_b535b20d-942b-417e-8580-f29cebe2401f/extract-utilities/0.log" Oct 01 14:43:35 crc kubenswrapper[4851]: I1001 14:43:35.089521 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-knk85_b535b20d-942b-417e-8580-f29cebe2401f/extract-content/0.log" Oct 01 14:43:35 crc kubenswrapper[4851]: I1001 14:43:35.147085 4851 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-knk85_b535b20d-942b-417e-8580-f29cebe2401f/extract-utilities/0.log" Oct 01 14:43:35 crc kubenswrapper[4851]: I1001 14:43:35.159751 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bgnmp_4c1a9063-23de-46a7-bd5e-8763dee075c4/extract-utilities/0.log" Oct 01 14:43:35 crc kubenswrapper[4851]: I1001 14:43:35.346958 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-knk85_b535b20d-942b-417e-8580-f29cebe2401f/registry-server/0.log" Oct 01 14:43:35 crc kubenswrapper[4851]: I1001 14:43:35.394534 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bgnmp_4c1a9063-23de-46a7-bd5e-8763dee075c4/extract-content/0.log" Oct 01 14:43:35 crc kubenswrapper[4851]: I1001 14:43:35.403267 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bgnmp_4c1a9063-23de-46a7-bd5e-8763dee075c4/extract-utilities/0.log" Oct 01 14:43:35 crc kubenswrapper[4851]: I1001 14:43:35.420922 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bgnmp_4c1a9063-23de-46a7-bd5e-8763dee075c4/extract-content/0.log" Oct 01 14:43:35 crc kubenswrapper[4851]: I1001 14:43:35.579653 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bgnmp_4c1a9063-23de-46a7-bd5e-8763dee075c4/extract-content/0.log" Oct 01 14:43:35 crc kubenswrapper[4851]: I1001 14:43:35.652813 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bgnmp_4c1a9063-23de-46a7-bd5e-8763dee075c4/extract-utilities/0.log" Oct 01 14:43:36 crc kubenswrapper[4851]: I1001 14:43:36.319945 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bgnmp_4c1a9063-23de-46a7-bd5e-8763dee075c4/registry-server/0.log" Oct 01 14:43:41 crc kubenswrapper[4851]: I1001 14:43:41.329406 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:43:41 crc kubenswrapper[4851]: E1001 14:43:41.332925 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:43:48 crc kubenswrapper[4851]: I1001 14:43:48.808720 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-2tptp_6dc06d35-7a66-4c6a-bd91-1de635f2160f/prometheus-operator/0.log" Oct 01 14:43:48 crc kubenswrapper[4851]: I1001 14:43:48.979680 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6dd5789d46-mf765_5797d328-2922-4183-86d5-7d237952df39/prometheus-operator-admission-webhook/0.log" Oct 01 14:43:49 crc kubenswrapper[4851]: I1001 14:43:49.014251 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6dd5789d46-l5l7n_fd26d552-d668-4eee-b51c-468b0f48d5e7/prometheus-operator-admission-webhook/0.log" Oct 01 14:43:49 crc kubenswrapper[4851]: I1001 
14:43:49.193364 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-9lt4m_d6d5593d-78e7-4efe-a8b1-f56bd214301b/operator/0.log" Oct 01 14:43:49 crc kubenswrapper[4851]: I1001 14:43:49.257768 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-t5ld5_6280e32a-cded-420b-8a4b-f65505578cf0/perses-operator/0.log" Oct 01 14:43:55 crc kubenswrapper[4851]: I1001 14:43:55.496605 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gn5s7"] Oct 01 14:43:55 crc kubenswrapper[4851]: E1001 14:43:55.497360 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1d04041-4489-41b2-b2a9-b086daff19ce" containerName="container-00" Oct 01 14:43:55 crc kubenswrapper[4851]: I1001 14:43:55.497372 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d04041-4489-41b2-b2a9-b086daff19ce" containerName="container-00" Oct 01 14:43:55 crc kubenswrapper[4851]: I1001 14:43:55.497599 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1d04041-4489-41b2-b2a9-b086daff19ce" containerName="container-00" Oct 01 14:43:55 crc kubenswrapper[4851]: I1001 14:43:55.499049 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gn5s7" Oct 01 14:43:55 crc kubenswrapper[4851]: I1001 14:43:55.522118 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gn5s7"] Oct 01 14:43:55 crc kubenswrapper[4851]: I1001 14:43:55.565340 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/261e38ea-cb5c-438a-9490-884e9cdedfeb-catalog-content\") pod \"redhat-operators-gn5s7\" (UID: \"261e38ea-cb5c-438a-9490-884e9cdedfeb\") " pod="openshift-marketplace/redhat-operators-gn5s7" Oct 01 14:43:55 crc kubenswrapper[4851]: I1001 14:43:55.565427 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/261e38ea-cb5c-438a-9490-884e9cdedfeb-utilities\") pod \"redhat-operators-gn5s7\" (UID: \"261e38ea-cb5c-438a-9490-884e9cdedfeb\") " pod="openshift-marketplace/redhat-operators-gn5s7" Oct 01 14:43:55 crc kubenswrapper[4851]: I1001 14:43:55.565451 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdt7k\" (UniqueName: \"kubernetes.io/projected/261e38ea-cb5c-438a-9490-884e9cdedfeb-kube-api-access-fdt7k\") pod \"redhat-operators-gn5s7\" (UID: \"261e38ea-cb5c-438a-9490-884e9cdedfeb\") " pod="openshift-marketplace/redhat-operators-gn5s7" Oct 01 14:43:55 crc kubenswrapper[4851]: I1001 14:43:55.667570 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/261e38ea-cb5c-438a-9490-884e9cdedfeb-utilities\") pod \"redhat-operators-gn5s7\" (UID: \"261e38ea-cb5c-438a-9490-884e9cdedfeb\") " pod="openshift-marketplace/redhat-operators-gn5s7" Oct 01 14:43:55 crc kubenswrapper[4851]: I1001 14:43:55.667618 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdt7k\" (UniqueName: \"kubernetes.io/projected/261e38ea-cb5c-438a-9490-884e9cdedfeb-kube-api-access-fdt7k\") pod \"redhat-operators-gn5s7\" (UID: \"261e38ea-cb5c-438a-9490-884e9cdedfeb\") " 
pod="openshift-marketplace/redhat-operators-gn5s7" Oct 01 14:43:55 crc kubenswrapper[4851]: I1001 14:43:55.667744 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/261e38ea-cb5c-438a-9490-884e9cdedfeb-catalog-content\") pod \"redhat-operators-gn5s7\" (UID: \"261e38ea-cb5c-438a-9490-884e9cdedfeb\") " pod="openshift-marketplace/redhat-operators-gn5s7" Oct 01 14:43:55 crc kubenswrapper[4851]: I1001 14:43:55.668109 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/261e38ea-cb5c-438a-9490-884e9cdedfeb-utilities\") pod \"redhat-operators-gn5s7\" (UID: \"261e38ea-cb5c-438a-9490-884e9cdedfeb\") " pod="openshift-marketplace/redhat-operators-gn5s7" Oct 01 14:43:55 crc kubenswrapper[4851]: I1001 14:43:55.668172 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/261e38ea-cb5c-438a-9490-884e9cdedfeb-catalog-content\") pod \"redhat-operators-gn5s7\" (UID: \"261e38ea-cb5c-438a-9490-884e9cdedfeb\") " pod="openshift-marketplace/redhat-operators-gn5s7" Oct 01 14:43:55 crc kubenswrapper[4851]: I1001 14:43:55.686922 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdt7k\" (UniqueName: \"kubernetes.io/projected/261e38ea-cb5c-438a-9490-884e9cdedfeb-kube-api-access-fdt7k\") pod \"redhat-operators-gn5s7\" (UID: \"261e38ea-cb5c-438a-9490-884e9cdedfeb\") " pod="openshift-marketplace/redhat-operators-gn5s7" Oct 01 14:43:55 crc kubenswrapper[4851]: I1001 14:43:55.816202 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gn5s7" Oct 01 14:43:56 crc kubenswrapper[4851]: I1001 14:43:56.332434 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:43:56 crc kubenswrapper[4851]: E1001 14:43:56.333103 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:43:56 crc kubenswrapper[4851]: I1001 14:43:56.349825 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gn5s7"] Oct 01 14:43:57 crc kubenswrapper[4851]: I1001 14:43:57.390561 4851 generic.go:334] "Generic (PLEG): container finished" podID="261e38ea-cb5c-438a-9490-884e9cdedfeb" containerID="01502d09479cddace3d3c4a853dc97a98a424c0ecd98a44caa93cf59f21b5baa" exitCode=0 Oct 01 14:43:57 crc kubenswrapper[4851]: I1001 14:43:57.390851 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gn5s7" event={"ID":"261e38ea-cb5c-438a-9490-884e9cdedfeb","Type":"ContainerDied","Data":"01502d09479cddace3d3c4a853dc97a98a424c0ecd98a44caa93cf59f21b5baa"} Oct 01 14:43:57 crc kubenswrapper[4851]: I1001 14:43:57.390878 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gn5s7" event={"ID":"261e38ea-cb5c-438a-9490-884e9cdedfeb","Type":"ContainerStarted","Data":"4a8a0a207ff66ab5b81aaa780d95074e2da7acd3b3b94ee4ece9893d29834834"} Oct 01 14:43:58 crc 
kubenswrapper[4851]: I1001 14:43:58.403021 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gn5s7" event={"ID":"261e38ea-cb5c-438a-9490-884e9cdedfeb","Type":"ContainerStarted","Data":"7789be2017265e93ca953f30d8046cc9e22cdb926786d68ef6cb1f7ab70cb1e6"} Oct 01 14:44:02 crc kubenswrapper[4851]: I1001 14:44:02.451022 4851 generic.go:334] "Generic (PLEG): container finished" podID="261e38ea-cb5c-438a-9490-884e9cdedfeb" containerID="7789be2017265e93ca953f30d8046cc9e22cdb926786d68ef6cb1f7ab70cb1e6" exitCode=0 Oct 01 14:44:02 crc kubenswrapper[4851]: I1001 14:44:02.451104 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gn5s7" event={"ID":"261e38ea-cb5c-438a-9490-884e9cdedfeb","Type":"ContainerDied","Data":"7789be2017265e93ca953f30d8046cc9e22cdb926786d68ef6cb1f7ab70cb1e6"} Oct 01 14:44:03 crc kubenswrapper[4851]: I1001 14:44:03.462770 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gn5s7" event={"ID":"261e38ea-cb5c-438a-9490-884e9cdedfeb","Type":"ContainerStarted","Data":"7710766cd9ce0081db5656bb779ffce3904b5cdf4948c90cb37067eb3d8d0fa8"} Oct 01 14:44:03 crc kubenswrapper[4851]: I1001 14:44:03.490493 4851 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gn5s7" podStartSLOduration=3.007425528 podStartE2EDuration="8.490471892s" podCreationTimestamp="2025-10-01 14:43:55 +0000 UTC" firstStartedPulling="2025-10-01 14:43:57.397656355 +0000 UTC m=+6645.742773841" lastFinishedPulling="2025-10-01 14:44:02.880702719 +0000 UTC m=+6651.225820205" observedRunningTime="2025-10-01 14:44:03.487016374 +0000 UTC m=+6651.832133860" watchObservedRunningTime="2025-10-01 14:44:03.490471892 +0000 UTC m=+6651.835589378" Oct 01 14:44:05 crc kubenswrapper[4851]: I1001 14:44:05.817197 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gn5s7" Oct 01 14:44:05 crc kubenswrapper[4851]: I1001 14:44:05.817858 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gn5s7" Oct 01 14:44:06 crc kubenswrapper[4851]: I1001 14:44:06.880458 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gn5s7" podUID="261e38ea-cb5c-438a-9490-884e9cdedfeb" containerName="registry-server" probeResult="failure" output=< Oct 01 14:44:06 crc kubenswrapper[4851]: timeout: failed to connect service ":50051" within 1s Oct 01 14:44:06 crc kubenswrapper[4851]: > Oct 01 14:44:08 crc kubenswrapper[4851]: I1001 14:44:08.328720 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:44:08 crc kubenswrapper[4851]: E1001 14:44:08.329312 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:44:16 crc kubenswrapper[4851]: I1001 14:44:16.877320 4851 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gn5s7" podUID="261e38ea-cb5c-438a-9490-884e9cdedfeb" containerName="registry-server" 
probeResult="failure" output=< Oct 01 14:44:16 crc kubenswrapper[4851]: timeout: failed to connect service ":50051" within 1s Oct 01 14:44:16 crc kubenswrapper[4851]: > Oct 01 14:44:21 crc kubenswrapper[4851]: I1001 14:44:21.329925 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:44:21 crc kubenswrapper[4851]: E1001 14:44:21.331068 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:44:25 crc kubenswrapper[4851]: I1001 14:44:25.879531 4851 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gn5s7" Oct 01 14:44:25 crc kubenswrapper[4851]: I1001 14:44:25.971280 4851 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gn5s7" Oct 01 14:44:26 crc kubenswrapper[4851]: I1001 14:44:26.708384 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gn5s7"] Oct 01 14:44:27 crc kubenswrapper[4851]: I1001 14:44:27.732622 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gn5s7" podUID="261e38ea-cb5c-438a-9490-884e9cdedfeb" containerName="registry-server" containerID="cri-o://7710766cd9ce0081db5656bb779ffce3904b5cdf4948c90cb37067eb3d8d0fa8" gracePeriod=2 Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.263904 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gn5s7" Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.357282 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/261e38ea-cb5c-438a-9490-884e9cdedfeb-catalog-content\") pod \"261e38ea-cb5c-438a-9490-884e9cdedfeb\" (UID: \"261e38ea-cb5c-438a-9490-884e9cdedfeb\") " Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.357472 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/261e38ea-cb5c-438a-9490-884e9cdedfeb-utilities\") pod \"261e38ea-cb5c-438a-9490-884e9cdedfeb\" (UID: \"261e38ea-cb5c-438a-9490-884e9cdedfeb\") " Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.357538 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdt7k\" (UniqueName: \"kubernetes.io/projected/261e38ea-cb5c-438a-9490-884e9cdedfeb-kube-api-access-fdt7k\") pod \"261e38ea-cb5c-438a-9490-884e9cdedfeb\" (UID: \"261e38ea-cb5c-438a-9490-884e9cdedfeb\") " Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.358403 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/261e38ea-cb5c-438a-9490-884e9cdedfeb-utilities" (OuterVolumeSpecName: "utilities") pod "261e38ea-cb5c-438a-9490-884e9cdedfeb" (UID: "261e38ea-cb5c-438a-9490-884e9cdedfeb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.362810 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/261e38ea-cb5c-438a-9490-884e9cdedfeb-kube-api-access-fdt7k" (OuterVolumeSpecName: "kube-api-access-fdt7k") pod "261e38ea-cb5c-438a-9490-884e9cdedfeb" (UID: "261e38ea-cb5c-438a-9490-884e9cdedfeb"). InnerVolumeSpecName "kube-api-access-fdt7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.454454 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/261e38ea-cb5c-438a-9490-884e9cdedfeb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "261e38ea-cb5c-438a-9490-884e9cdedfeb" (UID: "261e38ea-cb5c-438a-9490-884e9cdedfeb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.465972 4851 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/261e38ea-cb5c-438a-9490-884e9cdedfeb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.466114 4851 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/261e38ea-cb5c-438a-9490-884e9cdedfeb-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.466195 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdt7k\" (UniqueName: \"kubernetes.io/projected/261e38ea-cb5c-438a-9490-884e9cdedfeb-kube-api-access-fdt7k\") on node \"crc\" DevicePath \"\"" Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.747851 4851 generic.go:334] "Generic (PLEG): container finished" podID="261e38ea-cb5c-438a-9490-884e9cdedfeb" containerID="7710766cd9ce0081db5656bb779ffce3904b5cdf4948c90cb37067eb3d8d0fa8" exitCode=0 Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.747905 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gn5s7" event={"ID":"261e38ea-cb5c-438a-9490-884e9cdedfeb","Type":"ContainerDied","Data":"7710766cd9ce0081db5656bb779ffce3904b5cdf4948c90cb37067eb3d8d0fa8"} Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.748268 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gn5s7" event={"ID":"261e38ea-cb5c-438a-9490-884e9cdedfeb","Type":"ContainerDied","Data":"4a8a0a207ff66ab5b81aaa780d95074e2da7acd3b3b94ee4ece9893d29834834"} Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.747953 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gn5s7" Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.748287 4851 scope.go:117] "RemoveContainer" containerID="7710766cd9ce0081db5656bb779ffce3904b5cdf4948c90cb37067eb3d8d0fa8" Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.813140 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gn5s7"] Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.815001 4851 scope.go:117] "RemoveContainer" containerID="7789be2017265e93ca953f30d8046cc9e22cdb926786d68ef6cb1f7ab70cb1e6" Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.825266 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gn5s7"] Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.854097 4851 scope.go:117] "RemoveContainer" containerID="01502d09479cddace3d3c4a853dc97a98a424c0ecd98a44caa93cf59f21b5baa" Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.925671 4851 scope.go:117] "RemoveContainer" containerID="7710766cd9ce0081db5656bb779ffce3904b5cdf4948c90cb37067eb3d8d0fa8" Oct 01 14:44:28 crc kubenswrapper[4851]: E1001 14:44:28.933635 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7710766cd9ce0081db5656bb779ffce3904b5cdf4948c90cb37067eb3d8d0fa8\": container with ID starting with 7710766cd9ce0081db5656bb779ffce3904b5cdf4948c90cb37067eb3d8d0fa8 not found: ID does not exist" containerID="7710766cd9ce0081db5656bb779ffce3904b5cdf4948c90cb37067eb3d8d0fa8" Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.933687 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7710766cd9ce0081db5656bb779ffce3904b5cdf4948c90cb37067eb3d8d0fa8"} err="failed to get container status \"7710766cd9ce0081db5656bb779ffce3904b5cdf4948c90cb37067eb3d8d0fa8\": rpc error: code = NotFound desc = could not find container \"7710766cd9ce0081db5656bb779ffce3904b5cdf4948c90cb37067eb3d8d0fa8\": container with ID starting with 7710766cd9ce0081db5656bb779ffce3904b5cdf4948c90cb37067eb3d8d0fa8 not found: ID does not exist" Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.933715 4851 scope.go:117] "RemoveContainer" containerID="7789be2017265e93ca953f30d8046cc9e22cdb926786d68ef6cb1f7ab70cb1e6" Oct 01 14:44:28 crc kubenswrapper[4851]: E1001 14:44:28.937578 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7789be2017265e93ca953f30d8046cc9e22cdb926786d68ef6cb1f7ab70cb1e6\": container with ID starting with 7789be2017265e93ca953f30d8046cc9e22cdb926786d68ef6cb1f7ab70cb1e6 not found: ID does not exist" containerID="7789be2017265e93ca953f30d8046cc9e22cdb926786d68ef6cb1f7ab70cb1e6" Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.937609 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7789be2017265e93ca953f30d8046cc9e22cdb926786d68ef6cb1f7ab70cb1e6"} err="failed to get container status \"7789be2017265e93ca953f30d8046cc9e22cdb926786d68ef6cb1f7ab70cb1e6\": rpc error: code = NotFound desc = could not find container \"7789be2017265e93ca953f30d8046cc9e22cdb926786d68ef6cb1f7ab70cb1e6\": container with ID starting with 7789be2017265e93ca953f30d8046cc9e22cdb926786d68ef6cb1f7ab70cb1e6 not found: ID does not exist" Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.937626 4851 scope.go:117] "RemoveContainer" 
containerID="01502d09479cddace3d3c4a853dc97a98a424c0ecd98a44caa93cf59f21b5baa" Oct 01 14:44:28 crc kubenswrapper[4851]: E1001 14:44:28.946651 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01502d09479cddace3d3c4a853dc97a98a424c0ecd98a44caa93cf59f21b5baa\": container with ID starting with 01502d09479cddace3d3c4a853dc97a98a424c0ecd98a44caa93cf59f21b5baa not found: ID does not exist" containerID="01502d09479cddace3d3c4a853dc97a98a424c0ecd98a44caa93cf59f21b5baa" Oct 01 14:44:28 crc kubenswrapper[4851]: I1001 14:44:28.946703 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01502d09479cddace3d3c4a853dc97a98a424c0ecd98a44caa93cf59f21b5baa"} err="failed to get container status \"01502d09479cddace3d3c4a853dc97a98a424c0ecd98a44caa93cf59f21b5baa\": rpc error: code = NotFound desc = could not find container \"01502d09479cddace3d3c4a853dc97a98a424c0ecd98a44caa93cf59f21b5baa\": container with ID starting with 01502d09479cddace3d3c4a853dc97a98a424c0ecd98a44caa93cf59f21b5baa not found: ID does not exist" Oct 01 14:44:30 crc kubenswrapper[4851]: I1001 14:44:30.345325 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="261e38ea-cb5c-438a-9490-884e9cdedfeb" path="/var/lib/kubelet/pods/261e38ea-cb5c-438a-9490-884e9cdedfeb/volumes" Oct 01 14:44:34 crc kubenswrapper[4851]: I1001 14:44:34.328425 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:44:34 crc kubenswrapper[4851]: E1001 14:44:34.329217 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:44:49 crc kubenswrapper[4851]: I1001 14:44:49.328397 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:44:49 crc kubenswrapper[4851]: E1001 14:44:49.329360 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:45:00 crc kubenswrapper[4851]: I1001 14:45:00.222229 4851 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322165-62wbq"] Oct 01 14:45:00 crc kubenswrapper[4851]: E1001 14:45:00.223360 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="261e38ea-cb5c-438a-9490-884e9cdedfeb" containerName="extract-content" Oct 01 14:45:00 crc kubenswrapper[4851]: I1001 14:45:00.223377 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="261e38ea-cb5c-438a-9490-884e9cdedfeb" containerName="extract-content" Oct 01 14:45:00 crc kubenswrapper[4851]: E1001 14:45:00.223415 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="261e38ea-cb5c-438a-9490-884e9cdedfeb" containerName="extract-utilities" Oct 01 14:45:00 crc 
kubenswrapper[4851]: I1001 14:45:00.223425 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="261e38ea-cb5c-438a-9490-884e9cdedfeb" containerName="extract-utilities" Oct 01 14:45:00 crc kubenswrapper[4851]: E1001 14:45:00.223453 4851 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="261e38ea-cb5c-438a-9490-884e9cdedfeb" containerName="registry-server" Oct 01 14:45:00 crc kubenswrapper[4851]: I1001 14:45:00.223463 4851 state_mem.go:107] "Deleted CPUSet assignment" podUID="261e38ea-cb5c-438a-9490-884e9cdedfeb" containerName="registry-server" Oct 01 14:45:00 crc kubenswrapper[4851]: I1001 14:45:00.223771 4851 memory_manager.go:354] "RemoveStaleState removing state" podUID="261e38ea-cb5c-438a-9490-884e9cdedfeb" containerName="registry-server" Oct 01 14:45:00 crc kubenswrapper[4851]: I1001 14:45:00.224624 4851 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-62wbq" Oct 01 14:45:00 crc kubenswrapper[4851]: I1001 14:45:00.227372 4851 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 14:45:00 crc kubenswrapper[4851]: I1001 14:45:00.227617 4851 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 14:45:00 crc kubenswrapper[4851]: I1001 14:45:00.230900 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322165-62wbq"] Oct 01 14:45:00 crc kubenswrapper[4851]: I1001 14:45:00.267008 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26c1fe51-8f4b-4ed7-adaf-982e91662163-secret-volume\") pod \"collect-profiles-29322165-62wbq\" (UID: \"26c1fe51-8f4b-4ed7-adaf-982e91662163\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-62wbq" Oct 01 14:45:00 crc kubenswrapper[4851]: I1001 14:45:00.267255 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqlwx\" (UniqueName: \"kubernetes.io/projected/26c1fe51-8f4b-4ed7-adaf-982e91662163-kube-api-access-gqlwx\") pod \"collect-profiles-29322165-62wbq\" (UID: \"26c1fe51-8f4b-4ed7-adaf-982e91662163\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-62wbq" Oct 01 14:45:00 crc kubenswrapper[4851]: I1001 14:45:00.267459 4851 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26c1fe51-8f4b-4ed7-adaf-982e91662163-config-volume\") pod \"collect-profiles-29322165-62wbq\" (UID: \"26c1fe51-8f4b-4ed7-adaf-982e91662163\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-62wbq" Oct 01 14:45:00 crc kubenswrapper[4851]: I1001 14:45:00.330712 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:45:00 crc kubenswrapper[4851]: E1001 14:45:00.331038 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" 
podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:45:00 crc kubenswrapper[4851]: I1001 14:45:00.369671 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26c1fe51-8f4b-4ed7-adaf-982e91662163-secret-volume\") pod \"collect-profiles-29322165-62wbq\" (UID: \"26c1fe51-8f4b-4ed7-adaf-982e91662163\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-62wbq" Oct 01 14:45:00 crc kubenswrapper[4851]: I1001 14:45:00.369942 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqlwx\" (UniqueName: \"kubernetes.io/projected/26c1fe51-8f4b-4ed7-adaf-982e91662163-kube-api-access-gqlwx\") pod \"collect-profiles-29322165-62wbq\" (UID: \"26c1fe51-8f4b-4ed7-adaf-982e91662163\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-62wbq" Oct 01 14:45:00 crc kubenswrapper[4851]: I1001 14:45:00.370922 4851 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26c1fe51-8f4b-4ed7-adaf-982e91662163-config-volume\") pod \"collect-profiles-29322165-62wbq\" (UID: \"26c1fe51-8f4b-4ed7-adaf-982e91662163\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-62wbq" Oct 01 14:45:00 crc kubenswrapper[4851]: I1001 14:45:00.371589 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26c1fe51-8f4b-4ed7-adaf-982e91662163-config-volume\") pod \"collect-profiles-29322165-62wbq\" (UID: \"26c1fe51-8f4b-4ed7-adaf-982e91662163\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-62wbq" Oct 01 14:45:00 crc kubenswrapper[4851]: I1001 14:45:00.383315 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26c1fe51-8f4b-4ed7-adaf-982e91662163-secret-volume\") pod \"collect-profiles-29322165-62wbq\" (UID: \"26c1fe51-8f4b-4ed7-adaf-982e91662163\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-62wbq" Oct 01 14:45:00 crc kubenswrapper[4851]: I1001 14:45:00.395130 4851 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqlwx\" (UniqueName: \"kubernetes.io/projected/26c1fe51-8f4b-4ed7-adaf-982e91662163-kube-api-access-gqlwx\") pod \"collect-profiles-29322165-62wbq\" (UID: \"26c1fe51-8f4b-4ed7-adaf-982e91662163\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-62wbq" Oct 01 14:45:00 crc kubenswrapper[4851]: I1001 14:45:00.554326 4851 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-62wbq" Oct 01 14:45:01 crc kubenswrapper[4851]: I1001 14:45:01.055462 4851 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322165-62wbq"] Oct 01 14:45:01 crc kubenswrapper[4851]: W1001 14:45:01.064397 4851 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26c1fe51_8f4b_4ed7_adaf_982e91662163.slice/crio-7b3475f68fec738c8a31dbc5eaa46e1be571c4afaa13c42132ab9bbced7d51cb WatchSource:0}: Error finding container 7b3475f68fec738c8a31dbc5eaa46e1be571c4afaa13c42132ab9bbced7d51cb: Status 404 returned error can't find the container with id 7b3475f68fec738c8a31dbc5eaa46e1be571c4afaa13c42132ab9bbced7d51cb Oct 01 14:45:01 crc kubenswrapper[4851]: I1001 14:45:01.160756 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-62wbq" event={"ID":"26c1fe51-8f4b-4ed7-adaf-982e91662163","Type":"ContainerStarted","Data":"7b3475f68fec738c8a31dbc5eaa46e1be571c4afaa13c42132ab9bbced7d51cb"} Oct 01 14:45:02 crc kubenswrapper[4851]: I1001 14:45:02.173476 4851 generic.go:334] "Generic (PLEG): container finished" podID="26c1fe51-8f4b-4ed7-adaf-982e91662163" containerID="2fcaf977b619724f707eaf699e23df1bc9d8fee1afe67f67a30c1a5337a8cad6" exitCode=0 Oct 01 14:45:02 crc kubenswrapper[4851]: I1001 14:45:02.174701 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-62wbq" event={"ID":"26c1fe51-8f4b-4ed7-adaf-982e91662163","Type":"ContainerDied","Data":"2fcaf977b619724f707eaf699e23df1bc9d8fee1afe67f67a30c1a5337a8cad6"} Oct 01 14:45:03 crc kubenswrapper[4851]: I1001 14:45:03.553007 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-62wbq" Oct 01 14:45:03 crc kubenswrapper[4851]: I1001 14:45:03.639839 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqlwx\" (UniqueName: \"kubernetes.io/projected/26c1fe51-8f4b-4ed7-adaf-982e91662163-kube-api-access-gqlwx\") pod \"26c1fe51-8f4b-4ed7-adaf-982e91662163\" (UID: \"26c1fe51-8f4b-4ed7-adaf-982e91662163\") " Oct 01 14:45:03 crc kubenswrapper[4851]: I1001 14:45:03.639942 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26c1fe51-8f4b-4ed7-adaf-982e91662163-secret-volume\") pod \"26c1fe51-8f4b-4ed7-adaf-982e91662163\" (UID: \"26c1fe51-8f4b-4ed7-adaf-982e91662163\") " Oct 01 14:45:03 crc kubenswrapper[4851]: I1001 14:45:03.640048 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26c1fe51-8f4b-4ed7-adaf-982e91662163-config-volume\") pod \"26c1fe51-8f4b-4ed7-adaf-982e91662163\" (UID: \"26c1fe51-8f4b-4ed7-adaf-982e91662163\") " Oct 01 14:45:03 crc kubenswrapper[4851]: I1001 14:45:03.640952 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26c1fe51-8f4b-4ed7-adaf-982e91662163-config-volume" (OuterVolumeSpecName: "config-volume") pod "26c1fe51-8f4b-4ed7-adaf-982e91662163" (UID: "26c1fe51-8f4b-4ed7-adaf-982e91662163"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:45:03 crc kubenswrapper[4851]: I1001 14:45:03.646484 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26c1fe51-8f4b-4ed7-adaf-982e91662163-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "26c1fe51-8f4b-4ed7-adaf-982e91662163" (UID: "26c1fe51-8f4b-4ed7-adaf-982e91662163"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:45:03 crc kubenswrapper[4851]: I1001 14:45:03.648563 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26c1fe51-8f4b-4ed7-adaf-982e91662163-kube-api-access-gqlwx" (OuterVolumeSpecName: "kube-api-access-gqlwx") pod "26c1fe51-8f4b-4ed7-adaf-982e91662163" (UID: "26c1fe51-8f4b-4ed7-adaf-982e91662163"). InnerVolumeSpecName "kube-api-access-gqlwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:45:03 crc kubenswrapper[4851]: I1001 14:45:03.742881 4851 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26c1fe51-8f4b-4ed7-adaf-982e91662163-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:45:03 crc kubenswrapper[4851]: I1001 14:45:03.743140 4851 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26c1fe51-8f4b-4ed7-adaf-982e91662163-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:45:03 crc kubenswrapper[4851]: I1001 14:45:03.743220 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqlwx\" (UniqueName: \"kubernetes.io/projected/26c1fe51-8f4b-4ed7-adaf-982e91662163-kube-api-access-gqlwx\") on node \"crc\" DevicePath \"\"" Oct 01 14:45:04 crc kubenswrapper[4851]: I1001 14:45:04.203620 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-62wbq" event={"ID":"26c1fe51-8f4b-4ed7-adaf-982e91662163","Type":"ContainerDied","Data":"7b3475f68fec738c8a31dbc5eaa46e1be571c4afaa13c42132ab9bbced7d51cb"} Oct 01 14:45:04 crc kubenswrapper[4851]: I1001 14:45:04.203695 4851 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b3475f68fec738c8a31dbc5eaa46e1be571c4afaa13c42132ab9bbced7d51cb" Oct 01 14:45:04 crc kubenswrapper[4851]: I1001 14:45:04.204127 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-62wbq" Oct 01 14:45:04 crc kubenswrapper[4851]: I1001 14:45:04.667026 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322120-m9zzb"] Oct 01 14:45:04 crc kubenswrapper[4851]: I1001 14:45:04.677913 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322120-m9zzb"] Oct 01 14:45:06 crc kubenswrapper[4851]: I1001 14:45:06.353284 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac388880-3f2a-4a6c-868c-3bca0c63e6c3" path="/var/lib/kubelet/pods/ac388880-3f2a-4a6c-868c-3bca0c63e6c3/volumes" Oct 01 14:45:11 crc kubenswrapper[4851]: I1001 14:45:11.329401 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:45:11 crc kubenswrapper[4851]: E1001 14:45:11.330565 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:45:22 crc kubenswrapper[4851]: I1001 14:45:22.021735 4851 scope.go:117] "RemoveContainer" containerID="3a435c68fd4ab84ed0f3737c30ce8f5d248bca6805200e12465545e62830cc0f" Oct 01 14:45:26 crc kubenswrapper[4851]: I1001 14:45:26.328363 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:45:26 crc kubenswrapper[4851]: E1001 14:45:26.329549 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:45:41 crc kubenswrapper[4851]: I1001 14:45:41.328369 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:45:41 crc kubenswrapper[4851]: E1001 14:45:41.329074 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:45:54 crc kubenswrapper[4851]: I1001 14:45:54.329604 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:45:54 crc kubenswrapper[4851]: E1001 14:45:54.331016 4851 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv72m_openshift-machine-config-operator(f3acff5c-c60b-4f54-acfa-5521ded8b2af)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fv72m" podUID="f3acff5c-c60b-4f54-acfa-5521ded8b2af" Oct 01 14:46:00 crc kubenswrapper[4851]: I1001 14:46:00.929716 4851 generic.go:334] "Generic (PLEG): container finished" podID="480965d4-fec5-4956-9747-11204e8a03ba" containerID="f6338c28d3c07f7b1a0cfa8c65fdea614c532ebbb181411b0a625cf5389f30a1" exitCode=0 Oct 01 14:46:00 crc kubenswrapper[4851]: I1001 14:46:00.929844 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-95hhg/must-gather-d2kfp" event={"ID":"480965d4-fec5-4956-9747-11204e8a03ba","Type":"ContainerDied","Data":"f6338c28d3c07f7b1a0cfa8c65fdea614c532ebbb181411b0a625cf5389f30a1"} Oct 01 14:46:00 crc kubenswrapper[4851]: I1001 14:46:00.931118 4851 scope.go:117] "RemoveContainer" containerID="f6338c28d3c07f7b1a0cfa8c65fdea614c532ebbb181411b0a625cf5389f30a1" Oct 01 14:46:01 crc kubenswrapper[4851]: I1001 14:46:01.213446 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-95hhg_must-gather-d2kfp_480965d4-fec5-4956-9747-11204e8a03ba/gather/0.log" Oct 01 14:46:06 crc kubenswrapper[4851]: I1001 14:46:06.329038 4851 scope.go:117] "RemoveContainer" containerID="c48385807fde0e7f4409ba4c7bce09bc1772db9e1402e6343eef39294abcbd11" Oct 01 14:46:06 crc kubenswrapper[4851]: I1001 14:46:06.998811 4851 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv72m" event={"ID":"f3acff5c-c60b-4f54-acfa-5521ded8b2af","Type":"ContainerStarted","Data":"6fa8d68aab36b8ba50c69616fd2a3219ccbfd848ac69c39dde8d4547f3b1fe0e"} Oct 01 14:46:14 crc kubenswrapper[4851]: I1001 14:46:14.591406 4851 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-95hhg/must-gather-d2kfp"] Oct 01 14:46:14 crc kubenswrapper[4851]: I1001 14:46:14.592442 4851 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-95hhg/must-gather-d2kfp" podUID="480965d4-fec5-4956-9747-11204e8a03ba" containerName="copy" containerID="cri-o://7879b2c565288508fc4be33c71502e51b6fb2410aee9c9bd8c7010a6212bea84" gracePeriod=2 Oct 01 14:46:14 crc kubenswrapper[4851]: I1001 14:46:14.611751 4851 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-95hhg/must-gather-d2kfp"] Oct 01 14:46:15 crc kubenswrapper[4851]: I1001 14:46:15.014599 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-95hhg_must-gather-d2kfp_480965d4-fec5-4956-9747-11204e8a03ba/copy/0.log" Oct 01 14:46:15 crc kubenswrapper[4851]: I1001 14:46:15.015180 4851 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-95hhg/must-gather-d2kfp" Oct 01 14:46:15 crc kubenswrapper[4851]: I1001 14:46:15.114618 4851 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-95hhg_must-gather-d2kfp_480965d4-fec5-4956-9747-11204e8a03ba/copy/0.log" Oct 01 14:46:15 crc kubenswrapper[4851]: I1001 14:46:15.114907 4851 generic.go:334] "Generic (PLEG): container finished" podID="480965d4-fec5-4956-9747-11204e8a03ba" containerID="7879b2c565288508fc4be33c71502e51b6fb2410aee9c9bd8c7010a6212bea84" exitCode=143 Oct 01 14:46:15 crc kubenswrapper[4851]: I1001 14:46:15.114949 4851 scope.go:117] "RemoveContainer" containerID="7879b2c565288508fc4be33c71502e51b6fb2410aee9c9bd8c7010a6212bea84" Oct 01 14:46:15 crc kubenswrapper[4851]: I1001 14:46:15.115067 4851 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-95hhg/must-gather-d2kfp" Oct 01 14:46:15 crc kubenswrapper[4851]: I1001 14:46:15.130382 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/480965d4-fec5-4956-9747-11204e8a03ba-must-gather-output\") pod \"480965d4-fec5-4956-9747-11204e8a03ba\" (UID: \"480965d4-fec5-4956-9747-11204e8a03ba\") " Oct 01 14:46:15 crc kubenswrapper[4851]: I1001 14:46:15.130467 4851 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddz9r\" (UniqueName: \"kubernetes.io/projected/480965d4-fec5-4956-9747-11204e8a03ba-kube-api-access-ddz9r\") pod \"480965d4-fec5-4956-9747-11204e8a03ba\" (UID: \"480965d4-fec5-4956-9747-11204e8a03ba\") " Oct 01 14:46:15 crc kubenswrapper[4851]: I1001 14:46:15.143251 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/480965d4-fec5-4956-9747-11204e8a03ba-kube-api-access-ddz9r" (OuterVolumeSpecName: "kube-api-access-ddz9r") pod "480965d4-fec5-4956-9747-11204e8a03ba" (UID: "480965d4-fec5-4956-9747-11204e8a03ba"). InnerVolumeSpecName "kube-api-access-ddz9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:46:15 crc kubenswrapper[4851]: I1001 14:46:15.223591 4851 scope.go:117] "RemoveContainer" containerID="f6338c28d3c07f7b1a0cfa8c65fdea614c532ebbb181411b0a625cf5389f30a1" Oct 01 14:46:15 crc kubenswrapper[4851]: I1001 14:46:15.232266 4851 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddz9r\" (UniqueName: \"kubernetes.io/projected/480965d4-fec5-4956-9747-11204e8a03ba-kube-api-access-ddz9r\") on node \"crc\" DevicePath \"\"" Oct 01 14:46:15 crc kubenswrapper[4851]: I1001 14:46:15.266743 4851 scope.go:117] "RemoveContainer" containerID="7879b2c565288508fc4be33c71502e51b6fb2410aee9c9bd8c7010a6212bea84" Oct 01 14:46:15 crc kubenswrapper[4851]: E1001 14:46:15.267241 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7879b2c565288508fc4be33c71502e51b6fb2410aee9c9bd8c7010a6212bea84\": container with ID starting with 7879b2c565288508fc4be33c71502e51b6fb2410aee9c9bd8c7010a6212bea84 not found: ID does not exist" containerID="7879b2c565288508fc4be33c71502e51b6fb2410aee9c9bd8c7010a6212bea84" Oct 01 14:46:15 crc kubenswrapper[4851]: I1001 14:46:15.267273 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7879b2c565288508fc4be33c71502e51b6fb2410aee9c9bd8c7010a6212bea84"} err="failed to get container status \"7879b2c565288508fc4be33c71502e51b6fb2410aee9c9bd8c7010a6212bea84\": rpc error: code = NotFound desc = could not find container \"7879b2c565288508fc4be33c71502e51b6fb2410aee9c9bd8c7010a6212bea84\": container with ID starting with 7879b2c565288508fc4be33c71502e51b6fb2410aee9c9bd8c7010a6212bea84 not found: ID does not exist" Oct 01 14:46:15 crc kubenswrapper[4851]: I1001 14:46:15.267295 4851 scope.go:117] "RemoveContainer" containerID="f6338c28d3c07f7b1a0cfa8c65fdea614c532ebbb181411b0a625cf5389f30a1" Oct 01 14:46:15 crc kubenswrapper[4851]: E1001 14:46:15.267472 4851 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6338c28d3c07f7b1a0cfa8c65fdea614c532ebbb181411b0a625cf5389f30a1\": container with ID starting with f6338c28d3c07f7b1a0cfa8c65fdea614c532ebbb181411b0a625cf5389f30a1 not found: ID does not exist" 
containerID="f6338c28d3c07f7b1a0cfa8c65fdea614c532ebbb181411b0a625cf5389f30a1" Oct 01 14:46:15 crc kubenswrapper[4851]: I1001 14:46:15.267493 4851 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6338c28d3c07f7b1a0cfa8c65fdea614c532ebbb181411b0a625cf5389f30a1"} err="failed to get container status \"f6338c28d3c07f7b1a0cfa8c65fdea614c532ebbb181411b0a625cf5389f30a1\": rpc error: code = NotFound desc = could not find container \"f6338c28d3c07f7b1a0cfa8c65fdea614c532ebbb181411b0a625cf5389f30a1\": container with ID starting with f6338c28d3c07f7b1a0cfa8c65fdea614c532ebbb181411b0a625cf5389f30a1 not found: ID does not exist" Oct 01 14:46:15 crc kubenswrapper[4851]: I1001 14:46:15.380967 4851 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/480965d4-fec5-4956-9747-11204e8a03ba-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "480965d4-fec5-4956-9747-11204e8a03ba" (UID: "480965d4-fec5-4956-9747-11204e8a03ba"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:46:15 crc kubenswrapper[4851]: I1001 14:46:15.435884 4851 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/480965d4-fec5-4956-9747-11204e8a03ba-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 01 14:46:16 crc kubenswrapper[4851]: I1001 14:46:16.341084 4851 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="480965d4-fec5-4956-9747-11204e8a03ba" path="/var/lib/kubelet/pods/480965d4-fec5-4956-9747-11204e8a03ba/volumes" Oct 01 14:46:22 crc kubenswrapper[4851]: I1001 14:46:22.135762 4851 scope.go:117] "RemoveContainer" containerID="e1e2eae2a32e596bdac7a5350fbc36c58f075916a902ed14d0adadafde9b1f2f"